[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
30582 1726855263.31689: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ZzD
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
30582 1726855263.32115: Added group all to inventory
30582 1726855263.32117: Added group ungrouped to inventory
30582 1726855263.32121: Group all now contains ungrouped
30582 1726855263.32124: Examining possible inventory source: /tmp/network-Koj/inventory.yml
30582 1726855263.52286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
30582 1726855263.52394: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
30582 1726855263.52418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
30582 1726855263.52484: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
30582 1726855263.52567: Loaded config def from plugin (inventory/script)
30582 1726855263.52569: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
30582 1726855263.52613: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
30582 1726855263.52712: Loaded config def from plugin (inventory/yaml)
30582 1726855263.52715: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
30582 1726855263.52807: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
30582 1726855263.53341: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
30582 1726855263.53344: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
30582 1726855263.53347: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
30582 1726855263.53355: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
30582 1726855263.53363: Loading data from /tmp/network-Koj/inventory.yml
30582 1726855263.53437: /tmp/network-Koj/inventory.yml was not parsable by auto
30582 1726855263.53479: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
30582 1726855263.53511: Loading data from /tmp/network-Koj/inventory.yml
30582 1726855263.53567: group all already in inventory
30582 1726855263.53572: set inventory_file for managed_node1
30582 1726855263.53575: set inventory_dir for managed_node1
30582 1726855263.53576: Added host managed_node1 to inventory
30582 1726855263.53577: Added host managed_node1 to group all
30582 1726855263.53578: set ansible_host for managed_node1
30582 1726855263.53584: set ansible_ssh_extra_args for managed_node1
30582 1726855263.53589: set inventory_file for managed_node2
30582 1726855263.53592: set inventory_dir for managed_node2
30582 1726855263.53593: Added host managed_node2 to inventory
30582 1726855263.53594: Added host managed_node2 to group all
30582 1726855263.53595: set ansible_host for managed_node2
30582 1726855263.53595: set ansible_ssh_extra_args for managed_node2
30582 1726855263.53597: set inventory_file for managed_node3
30582 1726855263.53598: set inventory_dir for managed_node3
30582 1726855263.53599: Added host managed_node3 to inventory
30582 1726855263.53600: Added host managed_node3 to group all
30582 1726855263.53600: set ansible_host for managed_node3
30582 1726855263.53600: set ansible_ssh_extra_args for managed_node3
30582 1726855263.53603: Reconcile groups and hosts in inventory.
30582 1726855263.53605: Group ungrouped now contains managed_node1
30582 1726855263.53607: Group ungrouped now contains managed_node2
30582 1726855263.53607: Group ungrouped now contains managed_node3
30582 1726855263.53660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
30582 1726855263.53741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
30582 1726855263.53772: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
30582 1726855263.53792: Loaded config def from plugin (vars/host_group_vars)
30582 1726855263.53793: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
30582 1726855263.53798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
30582 1726855263.53804: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
30582 1726855263.53832: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
30582 1726855263.54060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855263.54131: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
30582 1726855263.54154: Loaded config def from plugin (connection/local)
30582 1726855263.54156: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
30582 1726855263.54536: Loaded config def from plugin (connection/paramiko_ssh)
30582 1726855263.54539: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
30582 1726855263.55086: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30582 1726855263.55129: Loaded config def from plugin (connection/psrp)
30582 1726855263.55131: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
30582 1726855263.55827: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30582 1726855263.55864: Loaded config def from plugin (connection/ssh)
30582 1726855263.55867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
30582 1726855263.57257: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
30582 1726855263.57282: Loaded config def from plugin (connection/winrm)
30582 1726855263.57285: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
30582 1726855263.57307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
30582 1726855263.57348: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
30582 1726855263.57390: Loaded config def from plugin (shell/cmd)
30582 1726855263.57391: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
30582 1726855263.57409: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
30582 1726855263.57446: Loaded config def from plugin (shell/powershell)
30582 1726855263.57447: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
30582 1726855263.57481: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
30582 1726855263.57583: Loaded config def from plugin (shell/sh)
30582 1726855263.57584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
30582 1726855263.57611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
30582 1726855263.57679: Loaded config def from plugin (become/runas)
30582 1726855263.57681: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
30582 1726855263.57790: Loaded config def from plugin (become/su)
30582 1726855263.57792: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
30582 1726855263.57885: Loaded config def from plugin (become/sudo)
30582 1726855263.57889: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
30582 1726855263.57911: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30582 1726855263.58123: in VariableManager get_vars()
30582 1726855263.58137: done with get_vars()
30582 1726855263.58238: trying /usr/local/lib/python3.12/site-packages/ansible/modules
30582 1726855263.61124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
30582 1726855263.61243: in VariableManager get_vars()
30582 1726855263.61254: done with get_vars()
30582 1726855263.61258: variable 'playbook_dir' from source: magic vars
30582 1726855263.61258: variable 'ansible_playbook_python' from source: magic vars
30582 1726855263.61259: variable 'ansible_config_file' from source: magic vars
30582 1726855263.61260: variable 'groups' from source: magic vars
30582 1726855263.61261: variable 'omit' from source: magic vars
30582 1726855263.61261: variable 'ansible_version' from source: magic vars
30582 1726855263.61262: variable 'ansible_check_mode' from source: magic vars
30582 1726855263.61263: variable 'ansible_diff_mode' from source: magic vars
30582 1726855263.61263: variable 'ansible_forks' from source: magic vars
30582 1726855263.61264: variable 'ansible_inventory_sources' from source: magic vars
30582 1726855263.61265: variable 'ansible_skip_tags' from source: magic vars
30582 1726855263.61265: variable 'ansible_limit' from source: magic vars
30582 1726855263.61266: variable 'ansible_run_tags' from source: magic vars
30582 1726855263.61267: variable 'ansible_verbosity' from source: magic vars
30582 1726855263.61307: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml
30582 1726855263.62010: in VariableManager get_vars()
30582 1726855263.62030: done with get_vars()
30582 1726855263.62086: in VariableManager get_vars()
30582 1726855263.62104: done with get_vars()
30582 1726855263.62154: in VariableManager get_vars()
30582 1726855263.62167: done with get_vars()
30582 1726855263.62222: in VariableManager get_vars()
30582 1726855263.62237: done with get_vars()
30582 1726855263.62290: in VariableManager get_vars()
30582 1726855263.62333: done with get_vars()
30582 1726855263.62403: in VariableManager get_vars()
30582 1726855263.62418: done with get_vars()
30582 1726855263.62471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
30582 1726855263.62485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
30582 1726855263.62805: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
30582 1726855263.62996: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
30582 1726855263.62999: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
30582 1726855263.63038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
30582 1726855263.63066: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
30582 1726855263.63272: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
30582 1726855263.63337: Loaded config def from plugin (callback/default)
30582 1726855263.63340: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30582 1726855263.64739: Loaded config def from plugin (callback/junit)
30582 1726855263.64742: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30582 1726855263.64786: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
30582 1726855263.64858: Loaded config def from plugin (callback/minimal)
30582 1726855263.64860: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30582 1726855263.64899: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
30582 1726855263.64960: Loaded config def from plugin (callback/tree)
30582 1726855263.64963: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
30582 1726855263.65082: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
30582 1726855263.65085: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
30582 1726855263.65126: in VariableManager get_vars()
30582 1726855263.65139: done with get_vars()
30582 1726855263.65146: in VariableManager get_vars()
30582 1726855263.65154: done with get_vars()
30582 1726855263.65159: variable 'omit' from source: magic vars
30582 1726855263.65198: in VariableManager get_vars()
30582 1726855263.65215: done with get_vars()
30582 1726855263.65237: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
30582 1726855263.68821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
30582 1726855263.68922: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
30582 1726855263.69123: getting the remaining hosts for this loop
30582 1726855263.69125: done getting the remaining hosts for this loop
30582 1726855263.69128: getting the next task for host managed_node3
30582 1726855263.69132: done getting next task for host managed_node3
30582 1726855263.69134: ^ task is: TASK: Gathering Facts
30582 1726855263.69135: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855263.69137: getting variables
30582 1726855263.69139: in VariableManager get_vars()
30582 1726855263.69150: Calling all_inventory to load vars for managed_node3
30582 1726855263.69152: Calling groups_inventory to load vars for managed_node3
30582 1726855263.69155: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855263.69166: Calling all_plugins_play to load vars for managed_node3
30582 1726855263.69176: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855263.69180: Calling groups_plugins_play to load vars for managed_node3
30582 1726855263.69215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855263.69268: done with get_vars()
30582 1726855263.69275: done getting variables
30582 1726855263.69524: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Friday 20 September 2024 14:01:03 -0400 (0:00:00.045) 0:00:00.045 ******
30582 1726855263.69547: entering _queue_task() for managed_node3/gather_facts
30582 1726855263.69548: Creating lock for gather_facts
30582 1726855263.70421: worker is 1 (out of 1 available)
30582 1726855263.70437: exiting _queue_task() for managed_node3/gather_facts
30582 1726855263.70450: done queuing things up, now waiting for results queue to drain
30582 1726855263.70453: waiting for pending results...
30582 1726855263.70793: running TaskExecutor() for managed_node3/TASK: Gathering Facts
30582 1726855263.70971: in run() - task 0affcc66-ac2b-aa83-7d57-00000000001b
30582 1726855263.71104: variable 'ansible_search_path' from source: unknown
30582 1726855263.71133: calling self._execute()
30582 1726855263.71360: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855263.71364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855263.71366: variable 'omit' from source: magic vars
30582 1726855263.71577: variable 'omit' from source: magic vars
30582 1726855263.71676: variable 'omit' from source: magic vars
30582 1726855263.71725: variable 'omit' from source: magic vars
30582 1726855263.71837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855263.71992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855263.72093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855263.72097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855263.72099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855263.72101: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855263.72104: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855263.72229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855263.72426: Set connection var ansible_timeout to 10
30582 1726855263.72435: Set connection var ansible_connection to ssh
30582 1726855263.72453: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855263.72797: Set connection var ansible_pipelining to False
30582 1726855263.72801: Set connection var ansible_shell_executable to /bin/sh
30582 1726855263.72803: Set connection var ansible_shell_type to sh
30582 1726855263.72806: variable 'ansible_shell_executable' from source: unknown
30582 1726855263.72808: variable 'ansible_connection' from source: unknown
30582 1726855263.72810: variable 'ansible_module_compression' from source: unknown
30582 1726855263.72812: variable 'ansible_shell_type' from source: unknown
30582 1726855263.72814: variable 'ansible_shell_executable' from source: unknown
30582 1726855263.72820: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855263.72822: variable 'ansible_pipelining' from source: unknown
30582 1726855263.72824: variable 'ansible_timeout' from source: unknown
30582 1726855263.72826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855263.73055: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
30582 1726855263.73070: variable 'omit' from source: magic vars
30582 1726855263.73079: starting attempt loop
30582 1726855263.73086: running the handler
30582 1726855263.73108: variable 'ansible_facts' from source: unknown
30582 1726855263.73139: _low_level_execute_command(): starting
30582 1726855263.73153: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30582 1726855263.74442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30582 1726855263.74556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855263.74571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<<
30582 1726855263.74582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855263.74703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<<
30582 1726855263.74716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30582 1726855263.74779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855263.74895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855263.76600: stdout chunk (state=3): >>>/root <<<
30582 1726855263.76792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855263.76807: stdout chunk (state=3): >>><<<
30582 1726855263.76822: stderr chunk (state=3): >>><<<
30582 1726855263.76936: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30582 1726855263.76940: _low_level_execute_command(): starting
30582 1726855263.77014: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402 `" && echo ansible-tmp-1726855263.7691743-30612-234758809968402="` echo /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402 `" ) && sleep 0'
30582 1726855263.77848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30582 1726855263.77851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855263.77854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<<
30582 1726855263.77856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855263.77859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855263.77867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855263.77927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855263.77998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855263.80111: stdout chunk (state=3): >>>ansible-tmp-1726855263.7691743-30612-234758809968402=/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402 <<<
30582 1726855263.80115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855263.80117: stdout chunk (state=3): >>><<<
30582 1726855263.80120: stderr chunk (state=3): >>><<<
30582 1726855263.80136: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855263.7691743-30612-234758809968402=/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30582 1726855263.80175: variable 'ansible_module_compression' from source: unknown
30582 1726855263.80238: ANSIBALLZ: Using generic lock for ansible.legacy.setup
30582 1726855263.80294: ANSIBALLZ: Acquiring lock
30582 1726855263.80297: ANSIBALLZ: Lock acquired: 140270807060400
30582 1726855263.80300: ANSIBALLZ: Creating module
30582 1726855264.32594: ANSIBALLZ: Writing module into payload
30582 1726855264.32598: ANSIBALLZ: Writing module
30582 1726855264.32939: ANSIBALLZ: Renaming module
30582 1726855264.32942: ANSIBALLZ: Done creating module
30582 1726855264.32945: variable 'ansible_facts' from source: unknown
30582 1726855264.32947: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855264.32949: _low_level_execute_command(): starting
30582 1726855264.32951: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
30582 1726855264.34254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855264.34269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<<
30582 1726855264.34279: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855264.34608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855264.34636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855264.36125: stdout chunk (state=3): >>>PLATFORM <<<
30582 1726855264.36216: stdout chunk (state=3): >>>Linux <<<
30582 1726855264.36228: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
30582 1726855264.36348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855264.36402: stderr chunk (state=3): >>><<<
30582 1726855264.36410: stdout chunk (state=3): >>><<<
30582 1726855264.36430: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30582 1726855264.36446 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
30582 1726855264.36801: _low_level_execute_command(): starting
30582 1726855264.36804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
30582 1726855264.36859: Sending initial data
30582 1726855264.36862: Sent initial data (1181 bytes)
30582 1726855264.37528: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30582 1726855264.37544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30582 1726855264.37558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855264.37576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30582 1726855264.37597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<<
30582 1726855264.37654: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855264.37762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855264.37777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855264.37908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855264.41328: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 30582 1726855264.42098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855264.42102: stdout chunk (state=3): >>><<< 30582 1726855264.42104: stderr chunk (state=3): >>><<< 30582 1726855264.42107: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855264.42109: variable 'ansible_facts' from source: unknown 30582 1726855264.42111: variable 'ansible_facts' from source: unknown 30582 1726855264.42112: variable 'ansible_module_compression' from source: unknown 30582 1726855264.42134: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30582 1726855264.42239: variable 'ansible_facts' from source: unknown 30582 1726855264.42599: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py 30582 1726855264.43405: Sending initial data 30582 1726855264.43409: Sent initial data (154 bytes) 30582 1726855264.44804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855264.44809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855264.44811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855264.44814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855264.44816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855264.45079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855264.45086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855264.45157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855264.45255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855264.46909: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855264.46949: stderr chunk (state=3): >>>debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855264.47114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855264.47202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppgh5x71v /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py <<< 30582 1726855264.47330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py" <<< 30582 1726855264.47346: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppgh5x71v" to remote "/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py" <<< 30582 1726855264.50131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855264.50394: stderr chunk (state=3): >>><<< 30582 1726855264.50398: stdout chunk (state=3): >>><<< 30582 1726855264.50400: done transferring module to remote 30582 1726855264.50405: _low_level_execute_command(): starting 30582 1726855264.50407: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/ /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py && sleep 0' 30582 1726855264.51485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855264.51594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855264.51626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855264.51637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855264.51705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855264.51780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855264.53638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855264.53866: stderr chunk (state=3): >>><<< 30582 1726855264.53870: stdout chunk (state=3): >>><<< 30582 1726855264.53873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855264.53875: _low_level_execute_command(): starting 30582 1726855264.53878: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/AnsiballZ_setup.py && sleep 0' 30582 1726855264.55408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855264.55519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855264.55644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855264.55818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855264.58025: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30582 1726855264.58126: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 30582 1726855264.58151: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30582 1726855264.58183: stdout chunk (state=3): >>>import 'posix' # <<< 30582 1726855264.58238: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30582 1726855264.58251: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 30582 1726855264.58312: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.58343: stdout chunk (state=3): >>>import '_codecs' # <<< 30582 1726855264.58354: stdout chunk (state=3): >>>import 'codecs' # <<< 30582 1726855264.58426: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 
30582 1726855264.58429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 30582 1726855264.58484: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d9684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d937b30> <<< 30582 1726855264.58494: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 30582 1726855264.58497: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d96aa50> <<< 30582 1726855264.58500: stdout chunk (state=3): >>>import '_signal' # <<< 30582 1726855264.58527: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 30582 1726855264.58542: stdout chunk (state=3): >>>import 'io' # <<< 30582 1726855264.58576: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 30582 1726855264.58657: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30582 1726855264.58689: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 30582 1726855264.58720: stdout chunk (state=3): >>>import 'os' # <<< 30582 1726855264.58745: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 30582 1726855264.58775: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 30582 1726855264.58820: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 30582 1726855264.58909: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d71d130> <<< 30582 1726855264.58933: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d71dfa0> <<< 30582 1726855264.58949: stdout chunk (state=3): >>>import 'site' # <<< 30582 1726855264.58969: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30582 1726855264.59351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30582 1726855264.59385: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30582 1726855264.59415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 30582 1726855264.59471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30582 1726855264.59543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30582 1726855264.59550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 30582 
1726855264.59554: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d75bda0> <<< 30582 1726855264.59605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30582 1726855264.59624: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d75bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30582 1726855264.59742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30582 1726855264.59773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d793770> <<< 30582 1726855264.59812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d793e00> import '_collections' # <<< 30582 1726855264.59892: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d773a40> import '_functools' # <<< 30582 1726855264.59933: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f622d771190> <<< 30582 1726855264.60100: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d758f50> <<< 30582 1726855264.60105: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 30582 1726855264.60128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30582 1726855264.60153: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b3710> <<< 30582 1726855264.60284: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 30582 1726855264.60293: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d772030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b0b30> <<< 30582 1726855264.60296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e87a0> <<< 30582 1726855264.60372: stdout chunk (state=3): >>>import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622d7581d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30582 1726855264.60414: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d7e8c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e8b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d7e8ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d756cf0> <<< 30582 1726855264.60509: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30582 1726855264.60536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e95b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e9280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7ea4b0> <<< 30582 1726855264.60569: stdout chunk (state=3): >>>import 'importlib.util' # <<< 30582 1726855264.60605: stdout chunk (state=3): >>>import 'runpy' # <<< 30582 1726855264.60612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30582 1726855264.60650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30582 1726855264.60678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d8006e0> <<< 30582 1726855264.60681: stdout chunk (state=3): >>>import 'errno' # <<< 30582 1726855264.60740: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d801df0> <<< 30582 1726855264.60877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30582 1726855264.60880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d802c60> # extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d8032c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d8021b0> <<< 30582 1726855264.60883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30582 1726855264.60909: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.60959: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d803d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d803470> <<< 30582 1726855264.61049: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7ea510> <<< 30582 1726855264.61068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30582 1726855264.61102: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d50bc50> <<< 30582 1726855264.61242: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d534710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d534470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.61246: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d534740> <<< 30582 1726855264.61259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30582 1726855264.61301: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.61426: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d535040> <<< 30582 1726855264.61542: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed 
from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d535a30> <<< 30582 1726855264.61579: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5348f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d509df0> <<< 30582 1726855264.61719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30582 1726855264.61744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d536d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d534ec0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7eac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30582 1726855264.61777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.61801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30582 1726855264.61834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30582 1726855264.61959: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d55f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30582 1726855264.61990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30582 1726855264.62016: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5834a0> <<< 30582 1726855264.62030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30582 1726855264.62081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30582 1726855264.62125: stdout chunk (state=3): >>>import 'ntpath' # <<< 30582 1726855264.62160: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 30582 1726855264.62200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e4260> <<< 30582 1726855264.62223: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30582 1726855264.62276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30582 1726855264.62379: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e69c0> <<< 30582 1726855264.62430: stdout chunk (state=3): >>>import 
'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e4380> <<< 30582 1726855264.62516: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5b1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf25370> <<< 30582 1726855264.62590: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5822a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d537ce0> <<< 30582 1726855264.62703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30582 1726855264.62753: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f622d5828a0> <<< 30582 1726855264.62986: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_pc_gvpcq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 30582 1726855264.63112: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.63192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30582 1726855264.63195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30582 1726855264.63496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # 
code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf8b0e0> import '_typing' # <<< 30582 1726855264.63520: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf69fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf69130> # zipimport: zlib available <<< 30582 1726855264.63538: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 30582 1726855264.63562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.63897: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 30582 1726855264.64944: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.66073: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 30582 1726855264.66091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf88f80> <<< 30582 1726855264.66099: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.66179: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30582 1726855264.66186: stdout 
chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.66194: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.66212: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbaab0> <<< 30582 1726855264.66225: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba840> <<< 30582 1726855264.66280: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30582 1726855264.66316: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d96a9c0> import 'atexit' # <<< 30582 1726855264.66347: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.66352: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbb800> <<< 30582 1726855264.66508: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.66517: stdout chunk (state=3): >>>import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfbbef0> import 'pwd' # <<< 30582 1726855264.66525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30582 1726855264.66553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30582 1726855264.66592: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce25ca0> <<< 30582 1726855264.66621: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce278c0> <<< 30582 1726855264.66715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30582 1726855264.66913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce29460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce302c0> <<< 30582 1726855264.66940: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2a210> <<< 30582 1726855264.66943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 30582 1726855264.67099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 30582 1726855264.67102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30582 1726855264.67273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30582 1726855264.67277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce33da0> import '_tokenize' # <<< 30582 1726855264.67347: stdout 
chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce328a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce32630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce32b40> <<< 30582 1726855264.67363: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2a720> <<< 30582 1726855264.67605: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce77a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce78140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce79be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce799a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30582 1726855264.67628: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce7c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30582 1726855264.67654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.67711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30582 1726855264.67735: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7f8f0> <<< 30582 1726855264.68183: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7c2c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80680> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80b00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce782c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd08230> <<< 30582 1726855264.68254: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.68258: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' 
import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd090d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce829c0> <<< 30582 1726855264.68327: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce83d70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce82600> <<< 30582 1726855264.68425: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.68431: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.68433: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 30582 1726855264.68436: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.68753: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.68811: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.69339: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.69869: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30582 1726855264.69905: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 30582 1726855264.69996: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd11370> <<< 30582 1726855264.70077: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd121e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd092e0> <<< 30582 1726855264.70248: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 30582 1726855264.70322: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.70496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd12900> # zipimport: zlib available <<< 30582 1726855264.70927: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.71365: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.71431: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.71505: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 30582 1726855264.71547: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30582 1726855264.71583: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 30582 1726855264.71607: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.71786: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 30582 1726855264.71820: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.71858: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 30582 1726855264.71861: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.72303: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.72312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30582 1726855264.72414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 30582 1726855264.72437: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd13470> <<< 30582 1726855264.72780: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 30582 1726855264.72792: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.72853: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.72860: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.72950: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30582 1726855264.72956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.73106: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd1dee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30582 1726855264.73112: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.73233: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.73401: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30582 1726855264.73427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30582 1726855264.73470: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30582 1726855264.73513: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce06780> <<< 30582 1726855264.73561: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cefa450> <<< 30582 1726855264.73910: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd1df70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd141d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.73919: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.73925: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.73954: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.73996: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74033: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74067: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 30582 1726855264.74213: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74303: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74324: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 30582 1726855264.74599: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74659: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74696: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.74766: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.74772: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 30582 1726855264.75073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb1af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c92fd10> <<< 30582 
1726855264.75077: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.75079: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c92ffe0> <<< 30582 1726855264.75116: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd9ade0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb2660> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb0200> <<< 30582 1726855264.75127: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb0bf0> <<< 30582 1726855264.75216: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 30582 1726855264.75238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 30582 1726855264.75275: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.75286: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c94aff0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94a8a0> <<< 30582 1726855264.75300: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.75391: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c94aa80> <<< 30582 1726855264.75501: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c949cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94b140> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30582 1726855264.75555: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9a1c10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94bbf0> <<< 30582 1726855264.75589: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb3e00> 
import 'ansible.module_utils.facts.timeout' # <<< 30582 1726855264.75618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 30582 1726855264.75642: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 30582 1726855264.75910: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 30582 1726855264.75914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 30582 1726855264.75948: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.75976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 30582 1726855264.75983: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.76030: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.76081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 30582 1726855264.76091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.76132: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.76172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 30582 1726855264.76178: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.76575: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.76579: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 30582 1726855264.76882: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 
1726855264.77310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 30582 1726855264.77364: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.77558: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.77565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 30582 1726855264.77568: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.77969: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.77978: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 30582 1726855264.77980: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.78043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a3d70> <<< 30582 1726855264.78119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 30582 1726855264.78172: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a26c0> import 'ansible.module_utils.facts.system.local' # <<< 30582 1726855264.78178: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 30582 1726855264.78244: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.78435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.78495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 30582 1726855264.78531: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.78562: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.78830: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 30582 1726855264.78837: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.78981: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9d5ee0> <<< 30582 1726855264.79073: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a38f0> <<< 30582 1726855264.79079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 30582 1726855264.79102: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.79142: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.79359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.79365: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.79478: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.79769: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.79801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 30582 1726855264.79813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 30582 1726855264.79818: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855264.79860: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9e9790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9c7ef0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 30582 1726855264.79965: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 30582 1726855264.79971: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.79974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 30582 1726855264.79999: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.80310: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 30582 1726855264.80394: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.80655: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.80925: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 30582 1726855264.81034: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.81220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 30582 1726855264.81224: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.81226: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.81228: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.81763: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.82504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 30582 1726855264.82618: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.82677: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 30582 1726855264.82693: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.82949: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 30582 1726855264.83071: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 
1726855264.83116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 30582 1726855264.83270: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83341: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83508: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30582 1726855264.83715: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83746: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.83800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 30582 1726855264.83812: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.83909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30582 1726855264.83932: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.83989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 30582 1726855264.84009: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.84044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 30582 1726855264.84102: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.84167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 30582 1726855264.84221: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.84270: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 30582 1726855264.84360: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 
1726855264.84582: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.84799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 30582 1726855264.84910: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.84938: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 30582 1726855264.84956: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 30582 1726855264.85092: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 30582 1726855264.85095: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85191: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 30582 1726855264.85195: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85223: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855264.85637: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 30582 1726855264.85734: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30582 1726855264.85783: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.85822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 30582 1726855264.85847: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86086: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 30582 1726855264.86227: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86263: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 30582 1726855264.86359: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 30582 1726855264.86494: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 30582 1726855264.86650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 30582 1726855264.86655: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86677: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.86758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # <<< 30582 1726855264.86793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 30582 1726855264.86869: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855264.87962: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30582 1726855264.87976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c7e6ea0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c7e7fb0> <<< 30582 1726855264.88014: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c7e5520> <<< 30582 1726855264.99722: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82c260> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 30582 1726855264.99740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 30582 1726855264.99757: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82d310> <<< 30582 1726855264.99836: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 30582 1726855264.99839: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855264.99867: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 30582 1726855264.99946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82fda0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82f200> <<< 30582 1726855265.00244: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 30582 1726855265.26790: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue 
Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "04", "epoch": "1726855264", "epoch_int": "1726855264", "date": "2024-09-20", "time": "14:01:04", "iso8601_micro": "2024-09-20T18:01:04.882401Z", "iso8601": "2024-09-20T18:01:04Z", "iso8601_basic": "20240920T140104882401", "iso8601_basic_short": "20240920T140104", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], 
"ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5830078125, "5m": 0.6142578125, "15m": 0.3642578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2993, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 538, "free": 2993}, "nocache": {"free": 3312, "used": 219}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1036, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798080512, "block_size": 4096, "block_total": 65519099, "block_available": 63915547, "block_used": 1603552, "inode_total": 131070960, "inode_available": 131029131, "inode_used": 41829, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "aa:60:c4:d8:31:87", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": 
"8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": 
"off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30582 1726855265.27606: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # 
clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] 
removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # 
cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing 
ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] 
removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd 
# destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 30582 1726855265.27969: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30582 1726855265.28018: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30582 1726855265.28051: stdout chunk (state=3): >>># destroy _bz2<<< 30582 1726855265.28073: stdout chunk (state=3): >>> # destroy _compression # destroy _lzma # destroy _blake2 <<< 30582 1726855265.28098: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 30582 1726855265.28130: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib<<< 30582 1726855265.28160: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress<<< 
30582 1726855265.28173: stdout chunk (state=3): >>> <<< 30582 1726855265.28215: stdout chunk (state=3): >>># destroy ntpath <<< 30582 1726855265.28229: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 30582 1726855265.28261: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 30582 1726855265.28291: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 30582 1726855265.28316: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale <<< 30582 1726855265.28344: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 30582 1726855265.28364: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog<<< 30582 1726855265.28374: stdout chunk (state=3): >>> # destroy uuid <<< 30582 1726855265.28424: stdout chunk (state=3): >>># destroy selinux # destroy shutil<<< 30582 1726855265.28447: stdout chunk (state=3): >>> # destroy distro <<< 30582 1726855265.28468: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging <<< 30582 1726855265.28527: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 30582 1726855265.28564: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 30582 1726855265.28619: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq<<< 30582 1726855265.28642: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.reduction # destroy selectors<<< 30582 1726855265.28682: stdout chunk (state=3): >>> # destroy shlex <<< 30582 1726855265.28714: stdout chunk (state=3): >>># destroy fcntl # destroy datetime 
<<< 30582 1726855265.28725: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 30582 1726855265.28754: stdout chunk (state=3): >>> # destroy _ssl <<< 30582 1726855265.28791: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 30582 1726855265.28822: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json<<< 30582 1726855265.28865: stdout chunk (state=3): >>> # destroy socket # destroy struct<<< 30582 1726855265.28917: stdout chunk (state=3): >>> # destroy glob # destroy fnmatch<<< 30582 1726855265.28922: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector<<< 30582 1726855265.28965: stdout chunk (state=3): >>> # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 30582 1726855265.28970: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection <<< 30582 1726855265.29032: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 30582 1726855265.29036: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 30582 1726855265.29083: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 30582 1726855265.29103: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 
30582 1726855265.29129: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 30582 1726855265.29158: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 30582 1726855265.29181: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 30582 1726855265.29210: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 30582 1726855265.29243: stdout chunk (state=3): >>> # cleanup[3] wiping _sre<<< 30582 1726855265.29264: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 30582 1726855265.29303: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 30582 1726855265.29336: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc<<< 30582 1726855265.29352: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 30582 1726855265.29376: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 30582 1726855265.29433: stdout chunk (state=3): >>> # destroy selinux._selinux<<< 30582 1726855265.29530: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30582 1726855265.29815: stdout chunk (state=3): >>># destroy sys.monitoring <<< 30582 1726855265.29868: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 30582 1726855265.29903: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 30582 1726855265.29923: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp<<< 30582 1726855265.29954: stdout chunk (state=3): >>> # destroy _io # destroy marshal # clear sys.meta_path<<< 30582 1726855265.29974: stdout chunk (state=3): >>> # clear sys.modules # destroy _frozen_importlib <<< 30582 1726855265.30067: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # 
destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 30582 1726855265.30099: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 30582 1726855265.30223: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 30582 1726855265.30249: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 30582 1726855265.31129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855265.31132: stdout chunk (state=3): >>><<< 30582 1726855265.31134: stderr chunk (state=3): >>><<< 30582 1726855265.31326: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d9684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d937b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d96aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d71d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d71dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d75bda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d75bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d793770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622d793e00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d773a40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d771190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d758f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d772030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7b0b30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e87a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7581d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d7e8c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e8b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d7e8ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d756cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e95b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7e9280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7ea4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d8006e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d801df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d802c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d8032c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d8021b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d803d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d803470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7ea510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d50bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d534710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d534470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d534740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d535040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622d535a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5348f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d509df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d536d80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d534ec0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d7eac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622d55f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5834a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e4260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e69c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5e4380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5b1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622cf25370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d5822a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d537ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f622d5828a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_pc_gvpcq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf8b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf69fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf69130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cf88f80> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbaab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfba5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622d96a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbb800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cfbb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cfbbef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce25ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce278c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce29460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce33da0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce328a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce32630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce32b40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce2a720> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce77a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce78140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce79be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce799a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce7c140> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7f8f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce7c2c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80680> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80b00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce80aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce782c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd08230> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd090d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce829c0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622ce83d70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce82600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd11370> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd121e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd092e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd12900> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd13470> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622cd1dee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622ce06780> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cefa450> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd1df70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd141d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb1af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c92fd10> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c92ffe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cd9ade0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb2660> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb0200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb0bf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c94aff0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94a8a0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c94aa80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c949cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94b140> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9a1c10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c94bbf0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622cdb3e00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a3d70> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a26c0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9d5ee0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9a38f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c9e9790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c9c7ef0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f622c7e6ea0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c7e7fb0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c7e5520> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82c260> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82d310> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82fda0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f622c82f200> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec2bc2acdd478a7423346e83b59fcdca", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "04", "epoch": "1726855264", "epoch_int": "1726855264", "date": "2024-09-20", "time": "14:01:04", "iso8601_micro": "2024-09-20T18:01:04.882401Z", "iso8601": "2024-09-20T18:01:04Z", "iso8601_basic": "20240920T140104882401", "iso8601_basic_short": "20240920T140104", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5830078125, "5m": 0.6142578125, "15m": 0.3642578125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": 
"dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2993, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 538, "free": 2993}, "nocache": {"free": 3312, "used": 219}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1036, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798080512, "block_size": 4096, "block_total": 65519099, "block_available": 63915547, "block_used": 1603552, "inode_total": 131070960, "inode_available": 131029131, "inode_used": 41829, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "aa:60:c4:d8:31:87", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": 
"on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", 
"type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] 
removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] 
removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing 
textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.9.244 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
30582 1726855265.33415: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855265.33421: _low_level_execute_command(): starting 30582 1726855265.33423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855263.7691743-30612-234758809968402/ > /dev/null 2>&1 && sleep 0' 30582 1726855265.34193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855265.34197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855265.34325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855265.34328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855265.34380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855265.34432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855265.34448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855265.34534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855265.36901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855265.37194: stdout chunk (state=3): >>><<< 30582 1726855265.37198: stderr chunk (state=3): >>><<< 30582 1726855265.37201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: 
Received exit status from master 0 30582 1726855265.37207: handler run complete 30582 1726855265.37210: variable 'ansible_facts' from source: unknown 30582 1726855265.37903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.38501: variable 'ansible_facts' from source: unknown 30582 1726855265.38642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.38893: attempt loop complete, returning result 30582 1726855265.38915: _execute() done 30582 1726855265.38922: dumping result to json 30582 1726855265.38975: done dumping result, returning 30582 1726855265.38989: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcc66-ac2b-aa83-7d57-00000000001b] 30582 1726855265.39003: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000001b 30582 1726855265.40000: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000001b 30582 1726855265.40004: WORKER PROCESS EXITING ok: [managed_node3] 30582 1726855265.40660: no more pending results, returning what we have 30582 1726855265.40663: results queue empty 30582 1726855265.40663: checking for any_errors_fatal 30582 1726855265.40665: done checking for any_errors_fatal 30582 1726855265.40665: checking for max_fail_percentage 30582 1726855265.40667: done checking for max_fail_percentage 30582 1726855265.40668: checking to see if all hosts have failed and the running result is not ok 30582 1726855265.40668: done checking to see if all hosts have failed 30582 1726855265.40669: getting the remaining hosts for this loop 30582 1726855265.40671: done getting the remaining hosts for this loop 30582 1726855265.40675: getting the next task for host managed_node3 30582 1726855265.40681: done getting next task for host managed_node3 30582 1726855265.40683: ^ task is: TASK: meta (flush_handlers) 30582 1726855265.40685: ^ state is: HOST STATE: block=1, task=1, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855265.40722: getting variables 30582 1726855265.40724: in VariableManager get_vars() 30582 1726855265.40745: Calling all_inventory to load vars for managed_node3 30582 1726855265.40748: Calling groups_inventory to load vars for managed_node3 30582 1726855265.40750: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855265.40759: Calling all_plugins_play to load vars for managed_node3 30582 1726855265.40761: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855265.40763: Calling groups_plugins_play to load vars for managed_node3 30582 1726855265.41007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.41219: done with get_vars() 30582 1726855265.41231: done getting variables 30582 1726855265.41293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 30582 1726855265.41351: in VariableManager get_vars() 30582 1726855265.41362: Calling all_inventory to load vars for managed_node3 30582 1726855265.41365: Calling groups_inventory to load vars for managed_node3 30582 1726855265.41367: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855265.41372: Calling all_plugins_play to load vars for managed_node3 30582 1726855265.41374: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855265.41377: Calling groups_plugins_play to load vars for managed_node3 30582 1726855265.41536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.41726: done with get_vars() 30582 1726855265.41741: done queuing things up, now waiting for results queue to drain 30582 
1726855265.41743: results queue empty 30582 1726855265.41744: checking for any_errors_fatal 30582 1726855265.41746: done checking for any_errors_fatal 30582 1726855265.41751: checking for max_fail_percentage 30582 1726855265.41753: done checking for max_fail_percentage 30582 1726855265.41754: checking to see if all hosts have failed and the running result is not ok 30582 1726855265.41755: done checking to see if all hosts have failed 30582 1726855265.41755: getting the remaining hosts for this loop 30582 1726855265.41756: done getting the remaining hosts for this loop 30582 1726855265.41759: getting the next task for host managed_node3 30582 1726855265.41763: done getting next task for host managed_node3 30582 1726855265.41766: ^ task is: TASK: Include the task 'el_repo_setup.yml' 30582 1726855265.41767: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855265.41770: getting variables 30582 1726855265.41771: in VariableManager get_vars() 30582 1726855265.41779: Calling all_inventory to load vars for managed_node3 30582 1726855265.41781: Calling groups_inventory to load vars for managed_node3 30582 1726855265.41783: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855265.41790: Calling all_plugins_play to load vars for managed_node3 30582 1726855265.41792: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855265.41796: Calling groups_plugins_play to load vars for managed_node3 30582 1726855265.41935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.42126: done with get_vars() 30582 1726855265.42134: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 14:01:05 -0400 (0:00:01.726) 0:00:01.772 ****** 30582 1726855265.42212: entering _queue_task() for managed_node3/include_tasks 30582 1726855265.42214: Creating lock for include_tasks 30582 1726855265.42527: worker is 1 (out of 1 available) 30582 1726855265.42539: exiting _queue_task() for managed_node3/include_tasks 30582 1726855265.42552: done queuing things up, now waiting for results queue to drain 30582 1726855265.42555: waiting for pending results... 
30582 1726855265.42775: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 30582 1726855265.43262: in run() - task 0affcc66-ac2b-aa83-7d57-000000000006 30582 1726855265.43266: variable 'ansible_search_path' from source: unknown 30582 1726855265.43270: calling self._execute() 30582 1726855265.43272: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855265.43274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855265.43277: variable 'omit' from source: magic vars 30582 1726855265.43360: _execute() done 30582 1726855265.43388: dumping result to json 30582 1726855265.43401: done dumping result, returning 30582 1726855265.43415: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affcc66-ac2b-aa83-7d57-000000000006] 30582 1726855265.43426: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000006 30582 1726855265.43633: no more pending results, returning what we have 30582 1726855265.43639: in VariableManager get_vars() 30582 1726855265.43674: Calling all_inventory to load vars for managed_node3 30582 1726855265.43677: Calling groups_inventory to load vars for managed_node3 30582 1726855265.43680: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855265.43697: Calling all_plugins_play to load vars for managed_node3 30582 1726855265.43700: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855265.43704: Calling groups_plugins_play to load vars for managed_node3 30582 1726855265.43922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.44295: done with get_vars() 30582 1726855265.44303: variable 'ansible_search_path' from source: unknown 30582 1726855265.44315: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000006 30582 1726855265.44318: WORKER PROCESS EXITING 30582 1726855265.44325: we have 
included files to process 30582 1726855265.44326: generating all_blocks data 30582 1726855265.44327: done generating all_blocks data 30582 1726855265.44328: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30582 1726855265.44329: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30582 1726855265.44331: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30582 1726855265.44972: in VariableManager get_vars() 30582 1726855265.44990: done with get_vars() 30582 1726855265.45002: done processing included file 30582 1726855265.45004: iterating over new_blocks loaded from include file 30582 1726855265.45005: in VariableManager get_vars() 30582 1726855265.45015: done with get_vars() 30582 1726855265.45016: filtering new block on tags 30582 1726855265.45030: done filtering new block on tags 30582 1726855265.45033: in VariableManager get_vars() 30582 1726855265.45042: done with get_vars() 30582 1726855265.45043: filtering new block on tags 30582 1726855265.45057: done filtering new block on tags 30582 1726855265.45060: in VariableManager get_vars() 30582 1726855265.45103: done with get_vars() 30582 1726855265.45105: filtering new block on tags 30582 1726855265.45119: done filtering new block on tags 30582 1726855265.45120: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 30582 1726855265.45126: extending task lists for all hosts with included blocks 30582 1726855265.45269: done extending task lists 30582 1726855265.45270: done processing included files 30582 1726855265.45271: results queue empty 30582 1726855265.45272: checking for any_errors_fatal 30582 1726855265.45273: done checking for any_errors_fatal 30582 
1726855265.45274: checking for max_fail_percentage 30582 1726855265.45275: done checking for max_fail_percentage 30582 1726855265.45275: checking to see if all hosts have failed and the running result is not ok 30582 1726855265.45276: done checking to see if all hosts have failed 30582 1726855265.45277: getting the remaining hosts for this loop 30582 1726855265.45278: done getting the remaining hosts for this loop 30582 1726855265.45280: getting the next task for host managed_node3 30582 1726855265.45284: done getting next task for host managed_node3 30582 1726855265.45288: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 30582 1726855265.45291: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855265.45292: getting variables 30582 1726855265.45293: in VariableManager get_vars() 30582 1726855265.45301: Calling all_inventory to load vars for managed_node3 30582 1726855265.45303: Calling groups_inventory to load vars for managed_node3 30582 1726855265.45305: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855265.45310: Calling all_plugins_play to load vars for managed_node3 30582 1726855265.45312: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855265.45315: Calling groups_plugins_play to load vars for managed_node3 30582 1726855265.45458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855265.45698: done with get_vars() 30582 1726855265.45707: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 14:01:05 -0400 (0:00:00.035) 0:00:01.807 ****** 30582 1726855265.45783: entering _queue_task() for managed_node3/setup 30582 1726855265.46068: worker is 1 (out of 1 available) 30582 1726855265.46079: exiting _queue_task() for managed_node3/setup 30582 1726855265.46295: done queuing things up, now waiting for results queue to drain 30582 1726855265.46298: waiting for pending results... 
30582 1726855265.46330: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 30582 1726855265.46433: in run() - task 0affcc66-ac2b-aa83-7d57-00000000002c 30582 1726855265.46450: variable 'ansible_search_path' from source: unknown 30582 1726855265.46456: variable 'ansible_search_path' from source: unknown 30582 1726855265.46496: calling self._execute() 30582 1726855265.46570: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855265.46580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855265.46596: variable 'omit' from source: magic vars 30582 1726855265.47100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855265.49096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855265.49350: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855265.49393: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855265.49433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855265.49469: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855265.49553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855265.49594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855265.49626: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855265.49675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855265.49699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855265.49870: variable 'ansible_facts' from source: unknown 30582 1726855265.49947: variable 'network_test_required_facts' from source: task vars 30582 1726855265.49990: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 30582 1726855265.50005: variable 'omit' from source: magic vars 30582 1726855265.50091: variable 'omit' from source: magic vars 30582 1726855265.50097: variable 'omit' from source: magic vars 30582 1726855265.50293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855265.50297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855265.50299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855265.50302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855265.50304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855265.50306: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855265.50308: variable 'ansible_host' from source: host vars for 
'managed_node3' 30582 1726855265.50310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855265.50355: Set connection var ansible_timeout to 10 30582 1726855265.50363: Set connection var ansible_connection to ssh 30582 1726855265.50376: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855265.50385: Set connection var ansible_pipelining to False 30582 1726855265.50399: Set connection var ansible_shell_executable to /bin/sh 30582 1726855265.50406: Set connection var ansible_shell_type to sh 30582 1726855265.50436: variable 'ansible_shell_executable' from source: unknown 30582 1726855265.50445: variable 'ansible_connection' from source: unknown 30582 1726855265.50452: variable 'ansible_module_compression' from source: unknown 30582 1726855265.50459: variable 'ansible_shell_type' from source: unknown 30582 1726855265.50466: variable 'ansible_shell_executable' from source: unknown 30582 1726855265.50474: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855265.50482: variable 'ansible_pipelining' from source: unknown 30582 1726855265.50491: variable 'ansible_timeout' from source: unknown 30582 1726855265.50500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855265.50650: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855265.50665: variable 'omit' from source: magic vars 30582 1726855265.50675: starting attempt loop 30582 1726855265.50682: running the handler 30582 1726855265.50751: _low_level_execute_command(): starting 30582 1726855265.50754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855265.51515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855265.51531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855265.51548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855265.51657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855265.54002: stdout chunk (state=3): >>>/root <<< 30582 1726855265.54162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855265.54246: stderr chunk (state=3): >>><<< 30582 1726855265.54257: stdout chunk (state=3): >>><<< 30582 1726855265.54301: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855265.54428: _low_level_execute_command(): starting 30582 1726855265.54433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952 `" && echo ansible-tmp-1726855265.5432231-30681-201194489615952="` echo /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952 `" ) && sleep 0' 30582 1726855265.55100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855265.55115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855265.55128: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30582 1726855265.55232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855265.55267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855265.55320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855265.55411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855265.58094: stdout chunk (state=3): >>>ansible-tmp-1726855265.5432231-30681-201194489615952=/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952 <<< 30582 1726855265.58402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855265.58406: stdout chunk (state=3): >>><<< 30582 1726855265.58409: stderr chunk (state=3): >>><<< 30582 1726855265.58411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855265.5432231-30681-201194489615952=/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855265.58413: variable 'ansible_module_compression' from source: unknown 30582 1726855265.58514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30582 1726855265.58649: variable 'ansible_facts' from source: unknown 30582 1726855265.59054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py 30582 1726855265.59277: Sending initial data 30582 1726855265.59672: Sent initial data (154 bytes) 30582 1726855265.60263: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855265.60277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855265.60294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855265.60401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855265.60413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855265.60431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855265.60524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855265.62310: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855265.62339: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855265.62392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855265.62483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi4uzti7b /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py <<< 30582 1726855265.62507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py" <<< 30582 1726855265.62594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi4uzti7b" to remote "/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py" <<< 30582 1726855265.64818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855265.65102: stderr chunk (state=3): >>><<< 30582 1726855265.65106: stdout chunk (state=3): >>><<< 30582 1726855265.65109: done transferring module to remote 30582 1726855265.65111: _low_level_execute_command(): starting 30582 1726855265.65113: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/ /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py && sleep 0' 30582 1726855265.66079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855265.66083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855265.66085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855265.66259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855265.66283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855265.66403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855265.66431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855265.68514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855265.68518: stdout chunk (state=3): >>><<< 30582 1726855265.68520: stderr chunk (state=3): >>><<< 30582 1726855265.68522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855265.68524: _low_level_execute_command(): starting 30582 1726855265.68526: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/AnsiballZ_setup.py && sleep 0' 30582 1726855265.69718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855265.69732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855265.69742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855265.69884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855265.69906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 30582 1726855265.70174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855265.72156: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30582 1726855265.72183: stdout chunk (state=3): >>>import _imp # builtin <<< 30582 1726855265.72217: stdout chunk (state=3): >>>import '_thread' # <<< 30582 1726855265.72342: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 30582 1726855265.72349: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 30582 1726855265.72362: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 30582 1726855265.72437: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 30582 1726855265.72453: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.72546: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 30582 1726855265.72623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc6184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc5e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc61aa50> <<< 30582 1726855265.72647: stdout chunk (state=3): >>>import 
'_signal' # <<< 30582 1726855265.72735: stdout chunk (state=3): >>>import '_abc' # <<< 30582 1726855265.72770: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 30582 1726855265.72811: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30582 1726855265.73059: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 30582 1726855265.73067: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc3c9130> <<< 30582 1726855265.73092: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.73108: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc3c9fa0> import 'site' # <<< 30582 1726855265.73128: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30582 1726855265.73693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30582 1726855265.73713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc407e30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 30582 1726855265.73795: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc407ef0> <<< 30582 1726855265.73841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30582 1726855265.74010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31dc43f860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc43fef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41fb00> <<< 30582 1726855265.74023: stdout chunk (state=3): >>>import '_functools' # <<< 30582 1726855265.74063: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41d220> <<< 30582 1726855265.74192: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc404fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30582 1726855265.74196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 30582 1726855265.74297: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30582 1726855265.74336: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45e420> <<< 30582 1726855265.74395: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41e0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45cc80> <<< 30582 1726855265.74442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc494830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc404260> <<< 30582 1726855265.74613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 30582 1726855265.74762: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc494ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc494b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc494f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc402d80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc495670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc495340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc496570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 30582 1726855265.74809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 30582 1726855265.74816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4ac770> <<< 30582 1726855265.75001: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4ade50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4aecf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.75004: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4ae240> <<< 30582 1726855265.75022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 30582 1726855265.75041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30582 1726855265.75104: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4afda0> <<< 30582 1726855265.75107: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4af4d0> <<< 30582 1726855265.75411: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4964e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' 
loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1a7cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d0710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d0470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d0740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 30582 1726855265.75469: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.75591: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d1070> <<< 30582 1726855265.75717: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d1a60> <<< 30582 1726855265.75740: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1a5e50> <<< 30582 1726855265.75838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 30582 1726855265.75861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d2e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d1b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc496c90> <<< 30582 1726855265.75880: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30582 1726855265.75949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.75967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30582 1726855265.76005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30582 1726855265.76024: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1fb1d0> <<< 30582 1726855265.76093: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.76127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 30582 1726855265.76163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30582 1726855265.76271: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc21f530> <<< 30582 1726855265.76291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30582 1726855265.76313: stdout chunk (state=3): >>>import 'ntpath' # <<< 30582 1726855265.76343: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc280320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30582 1726855265.76382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30582 1726855265.76408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 30582 1726855265.76503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' 
<<< 30582 1726855265.76531: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc282a80> <<< 30582 1726855265.76717: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc280440> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc245340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb29430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc21e330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d3da0> <<< 30582 1726855265.76867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30582 1726855265.76893: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f31dbb296d0> <<< 30582 1726855265.77235: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_2tat81g7/ansible_setup_payload.zip' # zipimport: zlib available <<< 30582 1726855265.77356: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.77392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 30582 1726855265.77439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 30582 1726855265.77510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30582 1726855265.77544: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb93140> <<< 30582 1726855265.77555: stdout chunk (state=3): >>>import '_typing' # <<< 30582 1726855265.77735: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb72030> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb711c0> <<< 30582 1726855265.78027: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.78032: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 30582 1726855265.79235: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.80346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb91430> <<< 30582 1726855265.80382: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.80430: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 30582 1726855265.80455: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches 
/usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 30582 1726855265.80467: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc2b40> <<< 30582 1726855265.80507: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc28d0> <<< 30582 1726855265.80616: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc21e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 30582 1726855265.80641: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb93dd0> import 'atexit' # <<< 30582 1726855265.80683: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc39b0> <<< 30582 1726855265.80711: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 30582 1726855265.80744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 30582 1726855265.80766: stdout chunk (state=3): >>>import '_locale' # <<< 30582 1726855265.80828: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc3e60> import 'pwd' # <<< 30582 1726855265.80837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 30582 1726855265.80871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 30582 1726855265.80904: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba2dbb0> <<< 30582 1726855265.80969: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba2f830> <<< 30582 1726855265.80972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 30582 1726855265.80974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 30582 1726855265.81049: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba30230> <<< 30582 1726855265.81052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30582 1726855265.81070: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba313a0> <<< 30582 1726855265.81239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 30582 1726855265.81286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30582 1726855265.81318: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba33e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc402e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 30582 1726855265.81340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 30582 1726855265.81361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 30582 1726855265.81479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 30582 1726855265.81623: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3bdd0> <<< 30582 1726855265.81626: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3a600> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 30582 1726855265.81690: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3ab70> <<< 30582 1726855265.82098: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba325d0> <<< 30582 1726855265.82104: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba7fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba80560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba81b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba81940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba83fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba82210> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30582 1726855265.82197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba877a0> <<< 30582 1726855265.82231: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba84170> <<< 30582 1726855265.82297: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba88860> <<< 30582 1726855265.82328: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba887d0> <<< 30582 1726855265.82372: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba88920> <<< 30582 1726855265.82414: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba80290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 30582 1726855265.82472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.82562: stdout chunk (state=3): >>># extension module 
'_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba8bfb0> <<< 30582 1726855265.82650: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.82679: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db9153d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba8a750> <<< 30582 1726855265.82797: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba8bad0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba8a360> <<< 30582 1726855265.82802: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 30582 1726855265.82840: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.82932: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.83002: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 30582 1726855265.83043: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 30582 1726855265.83133: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.83247: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.83769: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.84319: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 30582 1726855265.84334: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 30582 1726855265.84363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.84419: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.84434: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db919490> <<< 30582 1726855265.84518: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 30582 1726855265.84531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91a1e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db915430> <<< 30582 1726855265.84590: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 30582 1726855265.84624: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.84647: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 30582 1726855265.84788: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.84958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 30582 1726855265.84983: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91a210> <<< 30582 1726855265.84986: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.85593: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.85902: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.85945: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.86009: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 30582 1726855265.86061: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855265.86112: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 30582 1726855265.86165: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.86261: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 30582 1726855265.86366: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 30582 1726855265.86407: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 30582 1726855265.86615: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.86839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30582 1726855265.86939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # 
<<< 30582 1726855265.87157: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91b3e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 30582 1726855265.87160: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 30582 1726855265.87172: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87217: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87262: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 30582 1726855265.87265: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87310: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87354: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87412: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87479: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30582 1726855265.87509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.87608: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db925e80> <<< 30582 1726855265.87660: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db920d40> <<< 
30582 1726855265.87681: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 30582 1726855265.87693: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87737: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87820: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87849: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.87877: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.87902: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 30582 1726855265.87928: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 30582 1726855265.87945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30582 1726855265.88015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30582 1726855265.88041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 30582 1726855265.88086: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba0e630> <<< 30582 1726855265.88137: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbee300> <<< 30582 1726855265.88220: stdout chunk (state=3): >>>import 
'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db925bb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba88d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 30582 1726855265.88271: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88274: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88290: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30582 1726855265.88356: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30582 1726855265.88381: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 30582 1726855265.88450: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88527: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88539: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88550: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88595: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88630: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88717: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 30582 1726855265.88889: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.88939: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 30582 1726855265.89134: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.89276: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 
1726855265.89362: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.89395: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855265.89465: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 30582 1726855265.89504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b5cd0> <<< 30582 1726855265.89605: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 30582 1726855265.89706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5cfbf0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' 
executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5cfec0> <<< 30582 1726855265.89734: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b7380> <<< 30582 1726855265.89847: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b6840> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b43b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b7ec0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 30582 1726855265.90131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5daf00> <<< 30582 1726855265.90138: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5da7b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5da990> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5d9be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5db020> <<< 30582 1726855265.90201: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 30582 1726855265.90292: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db639b20> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5dbb00> <<< 30582 1726855265.90295: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b4110> import 'ansible.module_utils.facts.timeout' # <<< 30582 1726855265.90297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 30582 1726855265.90332: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 30582 1726855265.90366: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 
1726855265.90424: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 30582 1726855265.90438: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90574: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855265.90634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 30582 1726855265.90648: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90694: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 30582 1726855265.90791: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 30582 1726855265.90913: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.90950: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.91014: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.91111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 30582 1726855265.91586: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.91995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 30582 1726855265.92001: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92020: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92066: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30582 1726855265.92104: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92179: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 30582 1726855265.92262: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855265.92291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855265.92322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 30582 1726855265.92343: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92367: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92412: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 30582 1726855265.92457: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 30582 1726855265.92553: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.92927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db63bc50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db63a7b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 30582 1726855265.92980: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30582 1726855265.93092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 30582 1726855265.93295: stdout chunk (state=3): >>># zipimport: zlib available<<< 30582 1726855265.93301: stdout chunk (state=3): >>> <<< 30582 1726855265.93359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 30582 1726855265.93408: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.93500: stdout chunk (state=3): >>># zipimport: zlib available<<< 30582 1726855265.93567: stdout chunk (state=3): >>> <<< 30582 1726855265.93620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 30582 1726855265.93648: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.93714: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.93794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 30582 1726855265.93859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'<<< 30582 1726855265.93956: stdout chunk (state=3): >>> # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.94058: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.94074: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db681f70> <<< 30582 1726855265.94369: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db662de0> <<< 30582 1726855265.94407: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 30582 1726855265.94426: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.94512: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 30582 1726855265.94584: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.selinux' # <<< 30582 1726855265.94617: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.94738: stdout chunk (state=3): >>># zipimport: zlib available<<< 30582 1726855265.94760: stdout chunk (state=3): >>> <<< 30582 1726855265.94881: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.95052: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.95260: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 30582 1726855265.95316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 30582 1726855265.95335: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.95393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 30582 1726855265.95408: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.95694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 30582 1726855265.95717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855265.95830: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db6859d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db681d60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 30582 1726855265.95978: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 30582 1726855265.96133: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96229: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96279: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96318: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 30582 1726855265.96333: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96349: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96374: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96568: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96683: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 30582 1726855265.96917: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.96960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855265.97003: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.97612: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.98482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 30582 1726855265.98591: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30582 1726855265.98747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 30582 1726855265.98895: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.99033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 30582 1726855265.99050: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.99275: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.99526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 30582 1726855265.99551: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 30582 1726855265.99617: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855265.99673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 30582 1726855265.99838: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00045: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00246: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 30582 1726855266.00467: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00502: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 30582 1726855266.00543: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00566: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 30582 
1726855266.00615: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00675: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 30582 1726855266.00765: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 30582 1726855266.00800: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.00851: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.01276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 30582 1726855266.01511: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.01920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 30582 1726855266.01926: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02003: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 30582 1726855266.02091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02140: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 30582 1726855266.02192: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02271: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 30582 1726855266.02322: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30582 1726855266.02372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 30582 1726855266.02662: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 30582 1726855266.02706: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 30582 1726855266.02804: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.02850: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02897: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.02955: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.03049: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.03163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 30582 1726855266.03166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 30582 1726855266.03185: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.03292: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 30582 1726855266.03325: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.03633: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.03927: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 30582 1726855266.04044: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 30582 
1726855266.04075: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.04123: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.04189: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 30582 1726855266.04308: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.04423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 30582 1726855266.04558: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.04697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 30582 1726855266.04892: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.05470: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 30582 1726855266.05515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 30582 1726855266.05521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 30582 1726855266.05565: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db483320> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db480b90> <<< 30582 
1726855266.05628: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db480f20> <<< 30582 1726855266.06722: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "06", "epoch": "1726855266", "epoch_int": "1726855266", "date": "2024-09-20", "time": "14:01:06", "iso8601_micro": "2024-09-20T18:01:06.049377Z", "iso8601": "2024-09-20T18:01:06Z", "iso8601_basic": "20240920T140106049377", "iso8601_basic_short": "20240920T140106", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": 
false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa"<<< 30582 1726855266.06735: stdout chunk (state=3): >>>, "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30582 1726855266.07626: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # 
clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 30582 1726855266.07655: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # 
cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 30582 1726855266.07674: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy 
ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 30582 1726855266.07707: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time <<< 30582 1726855266.07964: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly <<< 30582 1726855266.07992: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 30582 1726855266.08189: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30582 1726855266.08195: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30582 1726855266.08237: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 30582 1726855266.08247: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 30582 1726855266.08269: stdout chunk (state=3): >>># destroy zipfile._path <<< 30582 1726855266.08279: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 30582 1726855266.08289: stdout chunk (state=3): >>># destroy ipaddress <<< 30582 1726855266.08311: stdout chunk 
(state=3): >>># destroy ntpath <<< 30582 1726855266.08348: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 30582 1726855266.08352: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 30582 1726855266.08363: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 30582 1726855266.08396: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal <<< 30582 1726855266.08403: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 30582 1726855266.08440: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 30582 1726855266.08468: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 30582 1726855266.08518: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 30582 1726855266.08527: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 30582 1726855266.08548: stdout chunk (state=3): >>># destroy _pickle <<< 30582 1726855266.08560: stdout chunk (state=3): >>># destroy queue <<< 30582 1726855266.08576: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 30582 1726855266.08589: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 30582 1726855266.08607: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 30582 1726855266.08613: stdout chunk (state=3): >>># destroy subprocess # destroy 
base64 <<< 30582 1726855266.08652: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 30582 1726855266.08658: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 30582 1726855266.08691: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 30582 1726855266.08708: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 30582 1726855266.08752: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 30582 1726855266.08783: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 30582 1726855266.08810: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 30582 1726855266.08820: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] 
wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 30582 1726855266.08832: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 30582 1726855266.08865: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 30582 1726855266.08967: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 30582 1726855266.08973: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30582 1726855266.09067: stdout chunk (state=3): >>># destroy sys.monitoring <<< 30582 1726855266.09074: stdout chunk (state=3): >>># destroy _socket <<< 30582 1726855266.09093: stdout chunk (state=3): >>># destroy _collections <<< 30582 1726855266.09124: stdout chunk (state=3): >>># destroy platform <<< 30582 1726855266.09128: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy 
genericpath # destroy re._parser # destroy tokenize <<< 30582 1726855266.09157: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 30582 1726855266.09166: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 30582 1726855266.09181: stdout chunk (state=3): >>># destroy _typing <<< 30582 1726855266.09201: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 30582 1726855266.09241: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30582 1726855266.09361: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 30582 1726855266.09368: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 30582 1726855266.09391: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect <<< 30582 1726855266.09397: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 30582 1726855266.09422: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre <<< 30582 1726855266.09427: stdout chunk (state=3): >>># destroy _string # destroy re <<< 30582 1726855266.09452: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix <<< 30582 1726855266.09461: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy 
_thread <<< 30582 1726855266.09563: stdout chunk (state=3): >>># clear sys.audit hooks <<< 30582 1726855266.09980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855266.10021: stderr chunk (state=3): >>><<< 30582 1726855266.10024: stdout chunk (state=3): >>><<< 30582 1726855266.10131: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc6184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc5e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc61aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding 
directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc3c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc3c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc407e30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # 
import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc407ef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc43f860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc43fef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41fb00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41d220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc404fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc41e0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc45cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc494830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc404260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc494ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc494b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc494f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc402d80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc495670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc495340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc496570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4ac770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4ade50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4aecf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4ae240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc4afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc4964e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1a7cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d0710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d0470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d0740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d1070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc1d1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1a5e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d2e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d1b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc496c90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1fb1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc21f530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f31dc280320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc282a80> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc280440> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc245340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb29430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc21e330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dc1d3da0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f31dbb296d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_2tat81g7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb93140> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb72030> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb711c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb91430> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc2b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc28d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc21e0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbb93dd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dbbc39b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbc3e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba2dbb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba2f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba30230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba313a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba33e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dc402e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3a600> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba3ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba325d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba7fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba80560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba81b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba81940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba83fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba82210> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba877a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba84170> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba88860> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba887d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba88920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba80290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba8bfb0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db9153d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba8a750> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31dba8bad0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba8a360> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db919490> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91a1e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db915430> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91a210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db91b3e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db925e80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db920d40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba0e630> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dbbee300> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db925bb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31dba88d70> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b5cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5cfbf0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5cfec0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b7380> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b6840> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b43b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b7ec0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5daf00> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31db5da7b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db5da990> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5d9be0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5db020> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db639b20> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db5dbb00> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db9b4110> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db63bc50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31db63a7b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db681f70> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db662de0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db6859d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db681d60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31db483320> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db480b90> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31db480f20> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "06", "epoch": "1726855266", "epoch_int": "1726855266", 
"date": "2024-09-20", "time": "14:01:06", "iso8601_micro": "2024-09-20T18:01:06.049377Z", "iso8601": "2024-09-20T18:01:06Z", "iso8601_basic": "20240920T140106049377", "iso8601_basic_short": "20240920T140106", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", 
"ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random 
# destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing 
systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # 
cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing 
ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # 
cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear 
sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30582 1726855266.10950: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855266.10953: _low_level_execute_command(): starting 30582 1726855266.10956: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855265.5432231-30681-201194489615952/ > /dev/null 2>&1 && sleep 0' 30582 1726855266.10982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855266.10986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855266.10991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.11014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855266.11016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.11058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855266.11068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.11146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.13801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.13834: stderr chunk (state=3): >>><<< 30582 1726855266.13837: stdout chunk (state=3): >>><<< 30582 1726855266.13855: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855266.13861: handler run complete 30582 1726855266.13891: variable 'ansible_facts' from source: unknown 30582 1726855266.13932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.14007: variable 'ansible_facts' from source: unknown 30582 1726855266.14055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.14091: attempt loop complete, returning result 30582 1726855266.14095: _execute() done 30582 1726855266.14097: dumping result to json 30582 1726855266.14107: done dumping result, returning 30582 1726855266.14114: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 
[0affcc66-ac2b-aa83-7d57-00000000002c] 30582 1726855266.14118: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002c 30582 1726855266.14246: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002c 30582 1726855266.14249: WORKER PROCESS EXITING ok: [managed_node3] 30582 1726855266.14349: no more pending results, returning what we have 30582 1726855266.14352: results queue empty 30582 1726855266.14353: checking for any_errors_fatal 30582 1726855266.14354: done checking for any_errors_fatal 30582 1726855266.14355: checking for max_fail_percentage 30582 1726855266.14356: done checking for max_fail_percentage 30582 1726855266.14363: checking to see if all hosts have failed and the running result is not ok 30582 1726855266.14364: done checking to see if all hosts have failed 30582 1726855266.14365: getting the remaining hosts for this loop 30582 1726855266.14366: done getting the remaining hosts for this loop 30582 1726855266.14370: getting the next task for host managed_node3 30582 1726855266.14377: done getting next task for host managed_node3 30582 1726855266.14379: ^ task is: TASK: Check if system is ostree 30582 1726855266.14382: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.14385: getting variables 30582 1726855266.14386: in VariableManager get_vars() 30582 1726855266.14415: Calling all_inventory to load vars for managed_node3 30582 1726855266.14418: Calling groups_inventory to load vars for managed_node3 30582 1726855266.14421: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.14430: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.14433: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.14435: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.14582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.14714: done with get_vars() 30582 1726855266.14723: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 14:01:06 -0400 (0:00:00.690) 0:00:02.497 ****** 30582 1726855266.14785: entering _queue_task() for managed_node3/stat 30582 1726855266.15303: worker is 1 (out of 1 available) 30582 1726855266.15310: exiting _queue_task() for managed_node3/stat 30582 1726855266.15321: done queuing things up, now waiting for results queue to drain 30582 1726855266.15323: waiting for pending results... 
30582 1726855266.15451: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 30582 1726855266.15456: in run() - task 0affcc66-ac2b-aa83-7d57-00000000002e 30582 1726855266.15458: variable 'ansible_search_path' from source: unknown 30582 1726855266.15461: variable 'ansible_search_path' from source: unknown 30582 1726855266.15467: calling self._execute() 30582 1726855266.15548: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.15559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.15571: variable 'omit' from source: magic vars 30582 1726855266.16057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855266.16254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855266.16298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855266.16327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855266.16355: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855266.16426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855266.16444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855266.16462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855266.16482: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855266.16576: Evaluated conditional (not __network_is_ostree is defined): True 30582 1726855266.16579: variable 'omit' from source: magic vars 30582 1726855266.16612: variable 'omit' from source: magic vars 30582 1726855266.16638: variable 'omit' from source: magic vars 30582 1726855266.16656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855266.16678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855266.16695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855266.16713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855266.16721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855266.16744: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855266.16747: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.16750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.16825: Set connection var ansible_timeout to 10 30582 1726855266.16829: Set connection var ansible_connection to ssh 30582 1726855266.16835: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855266.16839: Set connection var ansible_pipelining to False 30582 1726855266.16844: Set connection var ansible_shell_executable to /bin/sh 30582 1726855266.16846: Set connection var ansible_shell_type to sh 30582 1726855266.16863: variable 'ansible_shell_executable' from source: unknown 30582 1726855266.16865: variable 'ansible_connection' from 
source: unknown 30582 1726855266.16868: variable 'ansible_module_compression' from source: unknown 30582 1726855266.16870: variable 'ansible_shell_type' from source: unknown 30582 1726855266.16873: variable 'ansible_shell_executable' from source: unknown 30582 1726855266.16875: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.16878: variable 'ansible_pipelining' from source: unknown 30582 1726855266.16881: variable 'ansible_timeout' from source: unknown 30582 1726855266.16885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.16988: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855266.16997: variable 'omit' from source: magic vars 30582 1726855266.17004: starting attempt loop 30582 1726855266.17007: running the handler 30582 1726855266.17020: _low_level_execute_command(): starting 30582 1726855266.17026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855266.17508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855266.17527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.17583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855266.17586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855266.17591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.17663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.19942: stdout chunk (state=3): >>>/root <<< 30582 1726855266.20130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.20133: stdout chunk (state=3): >>><<< 30582 1726855266.20135: stderr chunk (state=3): >>><<< 30582 1726855266.20156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855266.20264: _low_level_execute_command(): starting 30582 1726855266.20268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767 `" && echo ansible-tmp-1726855266.2017062-30720-229252172967767="` echo /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767 `" ) && sleep 0' 30582 1726855266.20826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855266.20842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855266.20859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855266.20879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855266.20909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855266.20921: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855266.21006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.21029: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855266.21053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855266.21074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.21177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.23912: stdout chunk (state=3): >>>ansible-tmp-1726855266.2017062-30720-229252172967767=/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767 <<< 30582 1726855266.24196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.24397: stdout chunk (state=3): >>><<< 30582 1726855266.24400: stderr chunk (state=3): >>><<< 30582 1726855266.24404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855266.2017062-30720-229252172967767=/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855266.24406: variable 'ansible_module_compression' from source: unknown 30582 1726855266.24409: ANSIBALLZ: Using lock for stat 30582 1726855266.24411: ANSIBALLZ: Acquiring lock 30582 1726855266.24413: ANSIBALLZ: Lock acquired: 140270807060976 30582 1726855266.24415: ANSIBALLZ: Creating module 30582 1726855266.42226: ANSIBALLZ: Writing module into payload 30582 1726855266.42602: ANSIBALLZ: Writing module 30582 1726855266.42694: ANSIBALLZ: Renaming module 30582 1726855266.42700: ANSIBALLZ: Done creating module 30582 1726855266.42703: variable 'ansible_facts' from source: unknown 30582 1726855266.42743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py 30582 1726855266.42989: Sending initial data 30582 1726855266.43002: Sent initial data (153 bytes) 30582 1726855266.44364: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855266.44471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855266.44610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.44701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.46311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855266.46415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855266.46469: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0doragao /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py <<< 30582 1726855266.46483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py" <<< 30582 1726855266.46607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0doragao" to remote "/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py" <<< 30582 1726855266.48035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.48294: stderr chunk (state=3): >>><<< 30582 1726855266.48297: stdout chunk (state=3): >>><<< 30582 1726855266.48301: done transferring module to remote 30582 1726855266.48307: _low_level_execute_command(): starting 30582 1726855266.48309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/ /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py && sleep 0' 30582 1726855266.49660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855266.50005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855266.50022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.50109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.51923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.51979: stderr chunk (state=3): >>><<< 30582 1726855266.52297: stdout chunk (state=3): >>><<< 30582 1726855266.52302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855266.52305: _low_level_execute_command(): starting 30582 1726855266.52308: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/AnsiballZ_stat.py && sleep 0' 30582 1726855266.53352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855266.53356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855266.53359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.53361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855266.53363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855266.53373: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.53478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.53613: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855266.55902: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 30582 1726855266.55918: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30582 1726855266.55946: stdout chunk (state=3): >>>import 'posix' # <<< 30582 1726855266.55994: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30582 1726855266.56049: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 30582 1726855266.56067: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 30582 1726855266.56081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.56100: stdout chunk (state=3): >>>import '_codecs' # <<< 30582 1726855266.56112: stdout chunk (state=3): >>>import 'codecs' # <<< 30582 1726855266.56313: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56d8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dbea50> import '_signal' # import '_abc' # import 'abc' # <<< 30582 1726855266.56316: stdout chunk (state=3): >>>import 'io' # <<< 30582 
1726855266.56346: stdout chunk (state=3): >>>import '_stat' # <<< 30582 1726855266.56349: stdout chunk (state=3): >>>import 'stat' # <<< 30582 1726855266.56438: stdout chunk (state=3): >>>import '_collections_abc' # <<< 30582 1726855266.56457: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 30582 1726855266.56553: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 30582 1726855266.56639: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dcd130> <<< 30582 1726855266.56665: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 30582 1726855266.56680: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.56769: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30582 1726855266.56983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 30582 1726855266.57012: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 30582 1726855266.57028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.57110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 30582 1726855266.57114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 30582 1726855266.57311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56babe30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56babef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 30582 1726855266.57317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 30582 1726855266.57330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.57438: stdout chunk (state=3): 
>>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56be3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56be3ef0> <<< 30582 1726855266.57441: stdout chunk (state=3): >>>import '_collections' # <<< 30582 1726855266.57498: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc3b00> <<< 30582 1726855266.57537: stdout chunk (state=3): >>>import '_functools' # <<< 30582 1726855266.57610: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc1220> <<< 30582 1726855266.57622: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba8fe0> <<< 30582 1726855266.57651: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 30582 1726855266.57758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 30582 1726855266.57795: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7eff56c037d0> <<< 30582 1726855266.57976: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c023f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56baa8a0> <<< 30582 1726855266.57980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c38830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba8260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c38ce0> <<< 30582 1726855266.58074: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c38b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c38f50> import 'base64' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba6d80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.58090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 30582 1726855266.58123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 30582 1726855266.58137: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c392b0> import 'importlib.machinery' # <<< 30582 1726855266.58181: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 30582 1726855266.58204: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3a450> <<< 30582 1726855266.58312: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c50650> <<< 30582 1726855266.58406: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension 
module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c51d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 30582 1726855266.58435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 30582 1726855266.58438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 30582 1726855266.58632: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c52bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c53230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c52120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 30582 1726855266.58658: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c53cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c533e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3a3c0> <<< 30582 1726855266.58661: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 30582 1726855266.58680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 30582 1726855266.58710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 30582 1726855266.58738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 30582 1726855266.58756: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff569e3bf0> <<< 30582 1726855266.58870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0c3b0> <<< 30582 1726855266.59010: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855266.59080: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0cfb0> <<< 30582 1726855266.59196: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 30582 1726855266.59212: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0d9a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0c860> <<< 30582 1726855266.59308: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff569e1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 30582 1726855266.59323: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0ed80> <<< 30582 1726855266.59345: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0dac0> <<< 30582 1726855266.59426: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3ab70> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 30582 1726855266.59453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.59467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 30582 1726855266.59500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 30582 1726855266.59534: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a3b110> <<< 30582 1726855266.59588: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 30582 1726855266.59646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 30582 1726855266.59686: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a5b4a0> <<< 30582 1726855266.59709: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 30582 1726855266.59753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 30582 1726855266.59813: stdout chunk (state=3): >>>import 'ntpath' # <<< 30582 1726855266.59853: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' 
import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abc290> <<< 30582 1726855266.59866: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 30582 1726855266.59969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 30582 1726855266.60011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 30582 1726855266.60037: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abe9f0> <<< 30582 1726855266.60120: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abc3b0> <<< 30582 1726855266.60194: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a85280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56325370> <<< 30582 1726855266.60207: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a5a2a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0fce0> <<< 30582 1726855266.60324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 30582 1726855266.60345: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7eff56a5a8a0> <<< 30582 1726855266.60622: stdout chunk (state=3): >>># zipimport: found 30 names in 
'/tmp/ansible_stat_payload_qtsq299z/ansible_stat_payload.zip' <<< 30582 1726855266.60625: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.60905: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 30582 1726855266.60938: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5637b050> <<< 30582 1726855266.60941: stdout chunk (state=3): >>>import '_typing' # <<< 30582 1726855266.61129: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56359f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563590a0> <<< 30582 1726855266.61140: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.61236: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 30582 1726855266.61303: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.62629: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.63747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7eff56379340> <<< 30582 1726855266.63762: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.63880: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 30582 1726855266.63886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a29c0> <<< 30582 1726855266.63944: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 30582 1726855266.64150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5637bce0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a36b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a38f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 30582 1726855266.64307: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a3e00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5620db50> <<< 30582 1726855266.64321: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5620f770> <<< 30582 1726855266.64493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56210170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 30582 1726855266.64524: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56211310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 30582 1726855266.64560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 30582 1726855266.65020: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56213da0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56ba6e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56212060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621bce0> 
import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621a510> <<< 30582 1726855266.65023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621aa80> <<< 30582 1726855266.65424: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56212570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56263f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56264080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56265b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56265940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 30582 1726855266.65431: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56268110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56266240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 30582 1726855266.65434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.65470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 30582 1726855266.65473: stdout chunk (state=3): >>>import '_string' # <<< 30582 1726855266.65518: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626b830> <<< 30582 1726855266.65726: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56268200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626c650> <<< 30582 1726855266.65793: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626c890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626cb60> <<< 30582 1726855266.65820: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56264260> <<< 30582 1726855266.65841: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 30582 1726855266.66091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562f82f0> <<< 
30582 1726855266.66231: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562f9820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626ea80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626e6c0> # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.66244: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 30582 1726855266.66364: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66409: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66543: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66578: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66595: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 30582 1726855266.66625: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66653: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66669: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 30582 1726855266.66685: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.66884: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.67076: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30582 1726855266.68536: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.68642: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 30582 1726855266.68769: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562fd9a0> <<< 30582 1726855266.69525: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 30582 1726855266.69849: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562fe630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562f9940> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562fe6c0> # zipimport: zlib available <<< 30582 1726855266.70386: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.70419: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 30582 1726855266.70524: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.70559: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 30582 1726855266.70586: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71072: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 30582 1726855266.71093: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 30582 1726855266.71343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 30582 1726855266.71364: stdout chunk (state=3): >>>import '_ast' # <<< 30582 1726855266.71424: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562ff950> <<< 30582 1726855266.71446: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71515: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71637: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 30582 1726855266.71662: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71709: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.locale' # <<< 30582 1726855266.71807: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.71862: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.71929: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 30582 1726855266.71963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.72056: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5610a360> <<< 30582 1726855266.72094: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56105b50><<< 30582 1726855266.72168: stdout chunk (state=3): >>> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 30582 1726855266.72290: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 30582 1726855266.72348: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 30582 1726855266.72396: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py 
<<< 30582 1726855266.72430: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 30582 1726855266.72492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 30582 1726855266.72802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563fec30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563ee900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5610a3f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5620da30> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 30582 1726855266.72830: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 30582 1726855266.72944: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 30582 1726855266.72961: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.72964: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 30582 1726855266.72965: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.73175: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.73473: stdout chunk (state=3): >>># zipimport: zlib available <<< 30582 1726855266.73634: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, 
"get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 30582 1726855266.74119: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 30582 1726855266.74154: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 30582 1726855266.74253: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections 
# cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 30582 1726855266.74298: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing 
typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] 
removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_ut<<< 30582 1726855266.74347: stdout chunk (state=3): >>>ils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 30582 1726855266.74609: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 30582 1726855266.74638: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 30582 1726855266.74692: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 30582 1726855266.74815: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro <<< 30582 1726855266.74845: stdout chunk (state=3): >>># destroy argparse # destroy json # destroy logging # 
destroy shlex # destroy subprocess <<< 30582 1726855266.74905: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 30582 1726855266.74983: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 30582 1726855266.75009: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # 
destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 30582 1726855266.75186: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 30582 1726855266.75216: stdout chunk (state=3): >>># destroy _collections <<< 30582 1726855266.75227: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 30582 1726855266.75247: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 30582 1726855266.75294: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 30582 1726855266.75326: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 30582 1726855266.75461: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 
# destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 30582 1726855266.75522: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 30582 1726855266.75562: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix <<< 30582 1726855266.75578: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread <<< 30582 1726855266.75635: stdout chunk (state=3): >>># clear sys.audit hooks <<< 30582 1726855266.76235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855266.76239: stdout chunk (state=3): >>><<< 30582 1726855266.76242: stderr chunk (state=3): >>><<< 30582 1726855266.76359: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56d8bb00> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56dcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56babe30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56babef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56be3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff56be3ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc3b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc1220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba8fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c037d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c023f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56bc20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56baa8a0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c38830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba8260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c38ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c38b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c38f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56ba6d80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c392b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c50650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c51d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c52bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c53230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c52120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56c53cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c533e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3a3c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff569e3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0c3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0cfb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56a0d9a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0c860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff569e1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0ed80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0dac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56c3ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff56a3b110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a5b4a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abc290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abe9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56abc3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a85280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff56325370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a5a2a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56a0fce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7eff56a5a8a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_qtsq299z/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5637b050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56359f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563590a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56379340> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a2540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5637bce0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a36b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7eff563a38f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563a3e00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5620db50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5620f770> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56210170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56211310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56213da0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56ba6e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56212060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621bce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621a7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621a510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5621aa80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56212570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56263f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56264080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56265b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56265940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff56268110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56266240> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626b830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56268200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626c650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626c890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626cb60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56264260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562f82f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562f9820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626ea80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5626fe30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5626e6c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff562fd9a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562fe630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562f9940> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562fe6c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff562ff950> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff5610a360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff56105b50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563fec30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff563ee900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5610a3f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff5620da30> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing 
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # 
cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext 
# destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 30582 1726855266.78173: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, 
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855266.78176: _low_level_execute_command(): starting 30582 1726855266.78179: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855266.2017062-30720-229252172967767/ > /dev/null 2>&1 && sleep 0' 30582 1726855266.78567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855266.78619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855266.78634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855266.78652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855266.78667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855266.78868: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855266.78871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855266.78905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 
1726855266.78920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855266.79156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855266.81666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855266.81678: stdout chunk (state=3): >>><<< 30582 1726855266.81705: stderr chunk (state=3): >>><<< 30582 1726855266.81734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855266.81747: handler run complete 30582 1726855266.81772: attempt loop complete, returning result 30582 1726855266.81800: _execute() done 30582 1726855266.81803: dumping result to json 30582 1726855266.81805: done dumping result, returning 30582 1726855266.81893: done running TaskExecutor() for managed_node3/TASK: 
Check if system is ostree [0affcc66-ac2b-aa83-7d57-00000000002e] 30582 1726855266.81898: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002e 30582 1726855266.81968: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002e 30582 1726855266.81971: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855266.82041: no more pending results, returning what we have 30582 1726855266.82044: results queue empty 30582 1726855266.82045: checking for any_errors_fatal 30582 1726855266.82052: done checking for any_errors_fatal 30582 1726855266.82053: checking for max_fail_percentage 30582 1726855266.82054: done checking for max_fail_percentage 30582 1726855266.82055: checking to see if all hosts have failed and the running result is not ok 30582 1726855266.82056: done checking to see if all hosts have failed 30582 1726855266.82057: getting the remaining hosts for this loop 30582 1726855266.82058: done getting the remaining hosts for this loop 30582 1726855266.82061: getting the next task for host managed_node3 30582 1726855266.82067: done getting next task for host managed_node3 30582 1726855266.82069: ^ task is: TASK: Set flag to indicate system is ostree 30582 1726855266.82072: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.82075: getting variables 30582 1726855266.82077: in VariableManager get_vars() 30582 1726855266.82227: Calling all_inventory to load vars for managed_node3 30582 1726855266.82231: Calling groups_inventory to load vars for managed_node3 30582 1726855266.82234: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.82245: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.82248: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.82250: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.82680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.82982: done with get_vars() 30582 1726855266.82997: done getting variables 30582 1726855266.83106: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 14:01:06 -0400 (0:00:00.683) 0:00:03.181 ****** 30582 1726855266.83135: entering _queue_task() for managed_node3/set_fact 30582 1726855266.83137: Creating lock for set_fact 30582 1726855266.83790: worker is 1 (out of 1 available) 30582 1726855266.83801: exiting _queue_task() for managed_node3/set_fact 30582 1726855266.83898: done queuing things up, now waiting for results queue to drain 30582 1726855266.83900: waiting for pending results... 
30582 1726855266.84416: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 30582 1726855266.84421: in run() - task 0affcc66-ac2b-aa83-7d57-00000000002f 30582 1726855266.84424: variable 'ansible_search_path' from source: unknown 30582 1726855266.84427: variable 'ansible_search_path' from source: unknown 30582 1726855266.84436: calling self._execute() 30582 1726855266.84523: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.84537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.84553: variable 'omit' from source: magic vars 30582 1726855266.85035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855266.85294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855266.85348: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855266.85392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855266.85437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855266.85692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855266.85695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855266.85700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855266.85703: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855266.85794: Evaluated conditional (not __network_is_ostree is defined): True 30582 1726855266.85808: variable 'omit' from source: magic vars 30582 1726855266.85848: variable 'omit' from source: magic vars 30582 1726855266.85970: variable '__ostree_booted_stat' from source: set_fact 30582 1726855266.86027: variable 'omit' from source: magic vars 30582 1726855266.86055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855266.86086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855266.86113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855266.86134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855266.86148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855266.86177: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855266.86185: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.86195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.86321: Set connection var ansible_timeout to 10 30582 1726855266.86332: Set connection var ansible_connection to ssh 30582 1726855266.86344: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855266.86353: Set connection var ansible_pipelining to False 30582 1726855266.86370: Set connection var ansible_shell_executable to /bin/sh 30582 1726855266.86383: Set connection var ansible_shell_type to sh 30582 1726855266.86412: variable 'ansible_shell_executable' 
from source: unknown 30582 1726855266.86419: variable 'ansible_connection' from source: unknown 30582 1726855266.86425: variable 'ansible_module_compression' from source: unknown 30582 1726855266.86435: variable 'ansible_shell_type' from source: unknown 30582 1726855266.86442: variable 'ansible_shell_executable' from source: unknown 30582 1726855266.86448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.86455: variable 'ansible_pipelining' from source: unknown 30582 1726855266.86461: variable 'ansible_timeout' from source: unknown 30582 1726855266.86468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.86621: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855266.86636: variable 'omit' from source: magic vars 30582 1726855266.86645: starting attempt loop 30582 1726855266.86658: running the handler 30582 1726855266.86673: handler run complete 30582 1726855266.86689: attempt loop complete, returning result 30582 1726855266.86707: _execute() done 30582 1726855266.86716: dumping result to json 30582 1726855266.86726: done dumping result, returning 30582 1726855266.86737: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-00000000002f] 30582 1726855266.86764: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002f ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 30582 1726855266.86927: no more pending results, returning what we have 30582 1726855266.86930: results queue empty 30582 1726855266.86931: checking for any_errors_fatal 30582 1726855266.86938: done checking for any_errors_fatal 30582 
1726855266.86940: checking for max_fail_percentage 30582 1726855266.86942: done checking for max_fail_percentage 30582 1726855266.86951: checking to see if all hosts have failed and the running result is not ok 30582 1726855266.86952: done checking to see if all hosts have failed 30582 1726855266.86953: getting the remaining hosts for this loop 30582 1726855266.86955: done getting the remaining hosts for this loop 30582 1726855266.86959: getting the next task for host managed_node3 30582 1726855266.86969: done getting next task for host managed_node3 30582 1726855266.86971: ^ task is: TASK: Fix CentOS6 Base repo 30582 1726855266.86974: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.86978: getting variables 30582 1726855266.86979: in VariableManager get_vars() 30582 1726855266.87015: Calling all_inventory to load vars for managed_node3 30582 1726855266.87018: Calling groups_inventory to load vars for managed_node3 30582 1726855266.87022: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.87033: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.87036: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.87039: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.87593: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000002f 30582 1726855266.87604: WORKER PROCESS EXITING 30582 1726855266.87627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.88044: done with get_vars() 30582 1726855266.88057: done getting variables 30582 1726855266.88282: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 14:01:06 -0400 (0:00:00.051) 0:00:03.233 ****** 30582 1726855266.88314: entering _queue_task() for managed_node3/copy 30582 1726855266.88932: worker is 1 (out of 1 available) 30582 1726855266.88949: exiting _queue_task() for managed_node3/copy 30582 1726855266.88965: done queuing things up, now waiting for results queue to drain 30582 1726855266.88967: waiting for pending results... 
30582 1726855266.89150: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 30582 1726855266.89273: in run() - task 0affcc66-ac2b-aa83-7d57-000000000031 30582 1726855266.89296: variable 'ansible_search_path' from source: unknown 30582 1726855266.89304: variable 'ansible_search_path' from source: unknown 30582 1726855266.89348: calling self._execute() 30582 1726855266.89424: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.89435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.89456: variable 'omit' from source: magic vars 30582 1726855266.90018: variable 'ansible_distribution' from source: facts 30582 1726855266.90048: Evaluated conditional (ansible_distribution == 'CentOS'): True 30582 1726855266.90179: variable 'ansible_distribution_major_version' from source: facts 30582 1726855266.90192: Evaluated conditional (ansible_distribution_major_version == '6'): False 30582 1726855266.90201: when evaluation is False, skipping this task 30582 1726855266.90242: _execute() done 30582 1726855266.90245: dumping result to json 30582 1726855266.90248: done dumping result, returning 30582 1726855266.90250: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affcc66-ac2b-aa83-7d57-000000000031] 30582 1726855266.90253: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000031 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30582 1726855266.90532: no more pending results, returning what we have 30582 1726855266.90535: results queue empty 30582 1726855266.90536: checking for any_errors_fatal 30582 1726855266.90540: done checking for any_errors_fatal 30582 1726855266.90541: checking for max_fail_percentage 30582 1726855266.90543: done checking for max_fail_percentage 30582 1726855266.90544: checking to see if all hosts have failed and the 
running result is not ok 30582 1726855266.90545: done checking to see if all hosts have failed 30582 1726855266.90545: getting the remaining hosts for this loop 30582 1726855266.90547: done getting the remaining hosts for this loop 30582 1726855266.90550: getting the next task for host managed_node3 30582 1726855266.90557: done getting next task for host managed_node3 30582 1726855266.90675: ^ task is: TASK: Include the task 'enable_epel.yml' 30582 1726855266.90679: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.90684: getting variables 30582 1726855266.90686: in VariableManager get_vars() 30582 1726855266.90716: Calling all_inventory to load vars for managed_node3 30582 1726855266.90719: Calling groups_inventory to load vars for managed_node3 30582 1726855266.90722: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.90732: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.90735: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.90738: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.91250: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000031 30582 1726855266.91254: WORKER PROCESS EXITING 30582 1726855266.91281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.91532: done with get_vars() 30582 1726855266.91541: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 14:01:06 -0400 (0:00:00.033) 0:00:03.266 ****** 30582 1726855266.91671: entering _queue_task() for managed_node3/include_tasks 30582 1726855266.91963: worker is 1 (out of 1 available) 30582 1726855266.91977: exiting _queue_task() for managed_node3/include_tasks 30582 1726855266.91989: done queuing things up, now waiting for results queue to drain 30582 1726855266.91990: waiting for pending results... 
30582 1726855266.92172: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 30582 1726855266.92306: in run() - task 0affcc66-ac2b-aa83-7d57-000000000032 30582 1726855266.92310: variable 'ansible_search_path' from source: unknown 30582 1726855266.92313: variable 'ansible_search_path' from source: unknown 30582 1726855266.92322: calling self._execute() 30582 1726855266.92402: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.92417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.92431: variable 'omit' from source: magic vars 30582 1726855266.92940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855266.94958: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855266.95193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855266.95197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855266.95200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855266.95202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855266.95228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855266.95274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855266.95311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855266.95366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855266.95385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855266.95602: variable '__network_is_ostree' from source: set_fact 30582 1726855266.95636: Evaluated conditional (not __network_is_ostree | d(false)): True 30582 1726855266.95645: _execute() done 30582 1726855266.95651: dumping result to json 30582 1726855266.95658: done dumping result, returning 30582 1726855266.95669: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affcc66-ac2b-aa83-7d57-000000000032] 30582 1726855266.95702: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000032 30582 1726855266.95846: no more pending results, returning what we have 30582 1726855266.95851: in VariableManager get_vars() 30582 1726855266.95881: Calling all_inventory to load vars for managed_node3 30582 1726855266.95884: Calling groups_inventory to load vars for managed_node3 30582 1726855266.95890: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.95902: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.95905: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.95908: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.96093: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000032 30582 1726855266.96097: WORKER PROCESS EXITING 30582 1726855266.96111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 30582 1726855266.96433: done with get_vars() 30582 1726855266.96441: variable 'ansible_search_path' from source: unknown 30582 1726855266.96443: variable 'ansible_search_path' from source: unknown 30582 1726855266.96476: we have included files to process 30582 1726855266.96478: generating all_blocks data 30582 1726855266.96479: done generating all_blocks data 30582 1726855266.96482: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30582 1726855266.96484: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30582 1726855266.96486: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30582 1726855266.97174: done processing included file 30582 1726855266.97176: iterating over new_blocks loaded from include file 30582 1726855266.97178: in VariableManager get_vars() 30582 1726855266.97191: done with get_vars() 30582 1726855266.97193: filtering new block on tags 30582 1726855266.97215: done filtering new block on tags 30582 1726855266.97218: in VariableManager get_vars() 30582 1726855266.97229: done with get_vars() 30582 1726855266.97230: filtering new block on tags 30582 1726855266.97241: done filtering new block on tags 30582 1726855266.97243: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 30582 1726855266.97248: extending task lists for all hosts with included blocks 30582 1726855266.97343: done extending task lists 30582 1726855266.97345: done processing included files 30582 1726855266.97346: results queue empty 30582 1726855266.97346: checking for any_errors_fatal 30582 1726855266.97348: done checking for any_errors_fatal 30582 1726855266.97349: checking for max_fail_percentage 30582 1726855266.97350: done 
checking for max_fail_percentage 30582 1726855266.97351: checking to see if all hosts have failed and the running result is not ok 30582 1726855266.97352: done checking to see if all hosts have failed 30582 1726855266.97352: getting the remaining hosts for this loop 30582 1726855266.97353: done getting the remaining hosts for this loop 30582 1726855266.97355: getting the next task for host managed_node3 30582 1726855266.97359: done getting next task for host managed_node3 30582 1726855266.97361: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 30582 1726855266.97363: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.97365: getting variables 30582 1726855266.97366: in VariableManager get_vars() 30582 1726855266.97374: Calling all_inventory to load vars for managed_node3 30582 1726855266.97376: Calling groups_inventory to load vars for managed_node3 30582 1726855266.97378: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.97383: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.97392: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.97395: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.97556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.97737: done with get_vars() 30582 1726855266.97743: done getting variables 30582 1726855266.97789: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 30582 1726855266.97933: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 14:01:06 -0400 (0:00:00.062) 0:00:03.329 ****** 30582 1726855266.97966: entering _queue_task() for managed_node3/command 30582 1726855266.97967: Creating lock for command 30582 1726855266.98181: worker is 1 (out of 1 available) 30582 1726855266.98195: exiting _queue_task() for managed_node3/command 30582 1726855266.98206: done queuing things up, now waiting for results queue to drain 30582 1726855266.98207: waiting for pending results... 
30582 1726855266.98348: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 30582 1726855266.98414: in run() - task 0affcc66-ac2b-aa83-7d57-00000000004c 30582 1726855266.98424: variable 'ansible_search_path' from source: unknown 30582 1726855266.98429: variable 'ansible_search_path' from source: unknown 30582 1726855266.98456: calling self._execute() 30582 1726855266.98513: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855266.98517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855266.98525: variable 'omit' from source: magic vars 30582 1726855266.98784: variable 'ansible_distribution' from source: facts 30582 1726855266.98795: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30582 1726855266.98880: variable 'ansible_distribution_major_version' from source: facts 30582 1726855266.98891: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30582 1726855266.98894: when evaluation is False, skipping this task 30582 1726855266.98901: _execute() done 30582 1726855266.98903: dumping result to json 30582 1726855266.98906: done dumping result, returning 30582 1726855266.98912: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0affcc66-ac2b-aa83-7d57-00000000004c] 30582 1726855266.98916: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004c 30582 1726855266.99001: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004c 30582 1726855266.99004: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30582 1726855266.99050: no more pending results, returning what we have 30582 1726855266.99053: results queue empty 30582 1726855266.99054: checking for any_errors_fatal 30582 1726855266.99055: done checking for any_errors_fatal 30582 1726855266.99055: checking for 
max_fail_percentage 30582 1726855266.99057: done checking for max_fail_percentage 30582 1726855266.99058: checking to see if all hosts have failed and the running result is not ok 30582 1726855266.99058: done checking to see if all hosts have failed 30582 1726855266.99059: getting the remaining hosts for this loop 30582 1726855266.99060: done getting the remaining hosts for this loop 30582 1726855266.99063: getting the next task for host managed_node3 30582 1726855266.99068: done getting next task for host managed_node3 30582 1726855266.99071: ^ task is: TASK: Install yum-utils package 30582 1726855266.99074: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855266.99077: getting variables 30582 1726855266.99078: in VariableManager get_vars() 30582 1726855266.99107: Calling all_inventory to load vars for managed_node3 30582 1726855266.99109: Calling groups_inventory to load vars for managed_node3 30582 1726855266.99112: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855266.99120: Calling all_plugins_play to load vars for managed_node3 30582 1726855266.99122: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855266.99125: Calling groups_plugins_play to load vars for managed_node3 30582 1726855266.99249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855266.99466: done with get_vars() 30582 1726855266.99473: done getting variables 30582 1726855266.99557: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 14:01:06 -0400 (0:00:00.016) 0:00:03.345 ****** 30582 1726855266.99576: entering _queue_task() for managed_node3/package 30582 1726855266.99577: Creating lock for package 30582 1726855266.99813: worker is 1 (out of 1 available) 30582 1726855266.99824: exiting _queue_task() for managed_node3/package 30582 1726855266.99836: done queuing things up, now waiting for results queue to drain 30582 1726855266.99837: waiting for pending results... 
30582 1726855267.00066: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 30582 1726855267.00272: in run() - task 0affcc66-ac2b-aa83-7d57-00000000004d 30582 1726855267.00276: variable 'ansible_search_path' from source: unknown 30582 1726855267.00278: variable 'ansible_search_path' from source: unknown 30582 1726855267.00280: calling self._execute() 30582 1726855267.00318: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.00328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.00340: variable 'omit' from source: magic vars 30582 1726855267.00657: variable 'ansible_distribution' from source: facts 30582 1726855267.00661: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30582 1726855267.00893: variable 'ansible_distribution_major_version' from source: facts 30582 1726855267.00896: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30582 1726855267.00902: when evaluation is False, skipping this task 30582 1726855267.00905: _execute() done 30582 1726855267.00907: dumping result to json 30582 1726855267.00909: done dumping result, returning 30582 1726855267.00912: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affcc66-ac2b-aa83-7d57-00000000004d] 30582 1726855267.00914: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004d 30582 1726855267.00970: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004d 30582 1726855267.00973: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30582 1726855267.01036: no more pending results, returning what we have 30582 1726855267.01039: results queue empty 30582 1726855267.01039: checking for any_errors_fatal 30582 1726855267.01044: done checking for any_errors_fatal 30582 
1726855267.01044: checking for max_fail_percentage 30582 1726855267.01046: done checking for max_fail_percentage 30582 1726855267.01046: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.01047: done checking to see if all hosts have failed 30582 1726855267.01047: getting the remaining hosts for this loop 30582 1726855267.01048: done getting the remaining hosts for this loop 30582 1726855267.01051: getting the next task for host managed_node3 30582 1726855267.01056: done getting next task for host managed_node3 30582 1726855267.01058: ^ task is: TASK: Enable EPEL 7 30582 1726855267.01061: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.01064: getting variables 30582 1726855267.01065: in VariableManager get_vars() 30582 1726855267.01101: Calling all_inventory to load vars for managed_node3 30582 1726855267.01104: Calling groups_inventory to load vars for managed_node3 30582 1726855267.01107: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.01115: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.01118: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.01121: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.01303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.01533: done with get_vars() 30582 1726855267.01542: done getting variables 30582 1726855267.01595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 14:01:07 -0400 (0:00:00.020) 0:00:03.366 ****** 30582 1726855267.01622: entering _queue_task() for managed_node3/command 30582 1726855267.01944: worker is 1 (out of 1 available) 30582 1726855267.01954: exiting _queue_task() for managed_node3/command 30582 1726855267.01965: done queuing things up, now waiting for results queue to drain 30582 1726855267.01966: waiting for pending results... 
30582 1726855267.02125: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 30582 1726855267.02197: in run() - task 0affcc66-ac2b-aa83-7d57-00000000004e 30582 1726855267.02201: variable 'ansible_search_path' from source: unknown 30582 1726855267.02204: variable 'ansible_search_path' from source: unknown 30582 1726855267.02228: calling self._execute() 30582 1726855267.02274: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.02277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.02285: variable 'omit' from source: magic vars 30582 1726855267.02542: variable 'ansible_distribution' from source: facts 30582 1726855267.02551: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30582 1726855267.02641: variable 'ansible_distribution_major_version' from source: facts 30582 1726855267.02645: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30582 1726855267.02648: when evaluation is False, skipping this task 30582 1726855267.02651: _execute() done 30582 1726855267.02654: dumping result to json 30582 1726855267.02659: done dumping result, returning 30582 1726855267.02664: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affcc66-ac2b-aa83-7d57-00000000004e] 30582 1726855267.02669: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004e 30582 1726855267.02747: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004e 30582 1726855267.02750: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30582 1726855267.02796: no more pending results, returning what we have 30582 1726855267.02799: results queue empty 30582 1726855267.02800: checking for any_errors_fatal 30582 1726855267.02803: done checking for any_errors_fatal 30582 1726855267.02803: checking for 
max_fail_percentage 30582 1726855267.02805: done checking for max_fail_percentage 30582 1726855267.02806: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.02806: done checking to see if all hosts have failed 30582 1726855267.02807: getting the remaining hosts for this loop 30582 1726855267.02808: done getting the remaining hosts for this loop 30582 1726855267.02811: getting the next task for host managed_node3 30582 1726855267.02815: done getting next task for host managed_node3 30582 1726855267.02817: ^ task is: TASK: Enable EPEL 8 30582 1726855267.02821: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.02824: getting variables 30582 1726855267.02825: in VariableManager get_vars() 30582 1726855267.02847: Calling all_inventory to load vars for managed_node3 30582 1726855267.02850: Calling groups_inventory to load vars for managed_node3 30582 1726855267.02852: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.02862: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.02864: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.02867: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.03007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.03128: done with get_vars() 30582 1726855267.03134: done getting variables 30582 1726855267.03170: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 14:01:07 -0400 (0:00:00.015) 0:00:03.381 ****** 30582 1726855267.03191: entering _queue_task() for managed_node3/command 30582 1726855267.03361: worker is 1 (out of 1 available) 30582 1726855267.03374: exiting _queue_task() for managed_node3/command 30582 1726855267.03385: done queuing things up, now waiting for results queue to drain 30582 1726855267.03389: waiting for pending results... 
30582 1726855267.03519: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 30582 1726855267.03578: in run() - task 0affcc66-ac2b-aa83-7d57-00000000004f 30582 1726855267.03589: variable 'ansible_search_path' from source: unknown 30582 1726855267.03593: variable 'ansible_search_path' from source: unknown 30582 1726855267.03622: calling self._execute() 30582 1726855267.03672: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.03676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.03684: variable 'omit' from source: magic vars 30582 1726855267.04093: variable 'ansible_distribution' from source: facts 30582 1726855267.04096: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30582 1726855267.04164: variable 'ansible_distribution_major_version' from source: facts 30582 1726855267.04175: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30582 1726855267.04182: when evaluation is False, skipping this task 30582 1726855267.04195: _execute() done 30582 1726855267.04206: dumping result to json 30582 1726855267.04214: done dumping result, returning 30582 1726855267.04223: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affcc66-ac2b-aa83-7d57-00000000004f] 30582 1726855267.04232: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004f skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30582 1726855267.04363: no more pending results, returning what we have 30582 1726855267.04367: results queue empty 30582 1726855267.04367: checking for any_errors_fatal 30582 1726855267.04372: done checking for any_errors_fatal 30582 1726855267.04373: checking for max_fail_percentage 30582 1726855267.04374: done checking for max_fail_percentage 30582 1726855267.04375: checking to see if all hosts have failed and 
the running result is not ok 30582 1726855267.04376: done checking to see if all hosts have failed 30582 1726855267.04376: getting the remaining hosts for this loop 30582 1726855267.04378: done getting the remaining hosts for this loop 30582 1726855267.04381: getting the next task for host managed_node3 30582 1726855267.04393: done getting next task for host managed_node3 30582 1726855267.04395: ^ task is: TASK: Enable EPEL 6 30582 1726855267.04402: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.04406: getting variables 30582 1726855267.04408: in VariableManager get_vars() 30582 1726855267.04437: Calling all_inventory to load vars for managed_node3 30582 1726855267.04439: Calling groups_inventory to load vars for managed_node3 30582 1726855267.04443: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.04454: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.04457: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.04460: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.04849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.04992: done with get_vars() 30582 1726855267.05001: done getting variables 30582 1726855267.05030: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000004f 30582 1726855267.05034: WORKER PROCESS EXITING 30582 1726855267.05052: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 14:01:07 -0400 (0:00:00.018) 0:00:03.400 ****** 30582 1726855267.05071: entering _queue_task() for managed_node3/copy 30582 1726855267.05259: worker is 1 (out of 1 available) 30582 1726855267.05272: exiting _queue_task() for managed_node3/copy 30582 1726855267.05282: done queuing things up, now waiting for results queue to drain 30582 1726855267.05284: waiting for pending results... 
30582 1726855267.05430: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 30582 1726855267.05488: in run() - task 0affcc66-ac2b-aa83-7d57-000000000051 30582 1726855267.05498: variable 'ansible_search_path' from source: unknown 30582 1726855267.05501: variable 'ansible_search_path' from source: unknown 30582 1726855267.05531: calling self._execute() 30582 1726855267.05581: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.05585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.05594: variable 'omit' from source: magic vars 30582 1726855267.06093: variable 'ansible_distribution' from source: facts 30582 1726855267.06096: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30582 1726855267.06099: variable 'ansible_distribution_major_version' from source: facts 30582 1726855267.06101: Evaluated conditional (ansible_distribution_major_version == '6'): False 30582 1726855267.06103: when evaluation is False, skipping this task 30582 1726855267.06105: _execute() done 30582 1726855267.06107: dumping result to json 30582 1726855267.06109: done dumping result, returning 30582 1726855267.06112: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affcc66-ac2b-aa83-7d57-000000000051] 30582 1726855267.06114: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000051 30582 1726855267.06202: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000051 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30582 1726855267.06277: no more pending results, returning what we have 30582 1726855267.06280: results queue empty 30582 1726855267.06281: checking for any_errors_fatal 30582 1726855267.06286: done checking for any_errors_fatal 30582 1726855267.06290: checking for max_fail_percentage 30582 1726855267.06292: done checking for 
max_fail_percentage 30582 1726855267.06292: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.06293: done checking to see if all hosts have failed 30582 1726855267.06294: getting the remaining hosts for this loop 30582 1726855267.06296: done getting the remaining hosts for this loop 30582 1726855267.06299: getting the next task for host managed_node3 30582 1726855267.06311: done getting next task for host managed_node3 30582 1726855267.06314: ^ task is: TASK: Set network provider to 'nm' 30582 1726855267.06317: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855267.06321: getting variables 30582 1726855267.06323: in VariableManager get_vars() 30582 1726855267.06475: Calling all_inventory to load vars for managed_node3 30582 1726855267.06479: Calling groups_inventory to load vars for managed_node3 30582 1726855267.06482: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.06491: WORKER PROCESS EXITING 30582 1726855267.06503: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.06507: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.06511: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.06866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.07093: done with get_vars() 30582 1726855267.07113: done getting variables 30582 1726855267.07172: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 14:01:07 -0400 (0:00:00.021) 0:00:03.421 ****** 30582 1726855267.07201: entering _queue_task() for managed_node3/set_fact 30582 1726855267.07465: worker is 1 (out of 1 available) 30582 1726855267.07476: exiting _queue_task() for managed_node3/set_fact 30582 1726855267.07492: done queuing things up, now waiting for results queue to drain 30582 1726855267.07494: waiting for pending results... 30582 1726855267.07754: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 30582 1726855267.07854: in run() - task 0affcc66-ac2b-aa83-7d57-000000000007 30582 1726855267.07892: variable 'ansible_search_path' from source: unknown 30582 1726855267.07934: calling self._execute() 30582 1726855267.08100: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.08104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.08108: variable 'omit' from source: magic vars 30582 1726855267.08171: variable 'omit' from source: magic vars 30582 1726855267.08312: variable 'omit' from source: magic vars 30582 1726855267.08317: variable 'omit' from source: magic vars 30582 1726855267.08320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855267.08366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855267.08397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855267.08433: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855267.08546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855267.08550: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855267.08552: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.08555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.08633: Set connection var ansible_timeout to 10 30582 1726855267.08643: Set connection var ansible_connection to ssh 30582 1726855267.08674: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855267.08764: Set connection var ansible_pipelining to False 30582 1726855267.08767: Set connection var ansible_shell_executable to /bin/sh 30582 1726855267.08769: Set connection var ansible_shell_type to sh 30582 1726855267.08771: variable 'ansible_shell_executable' from source: unknown 30582 1726855267.08774: variable 'ansible_connection' from source: unknown 30582 1726855267.08776: variable 'ansible_module_compression' from source: unknown 30582 1726855267.08778: variable 'ansible_shell_type' from source: unknown 30582 1726855267.08779: variable 'ansible_shell_executable' from source: unknown 30582 1726855267.08782: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.08784: variable 'ansible_pipelining' from source: unknown 30582 1726855267.08785: variable 'ansible_timeout' from source: unknown 30582 1726855267.08789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.09019: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855267.09023: variable 'omit' from source: magic vars 30582 1726855267.09025: starting attempt loop 30582 1726855267.09027: running the handler 30582 1726855267.09030: handler run complete 30582 1726855267.09032: attempt loop complete, returning result 30582 1726855267.09034: _execute() done 30582 1726855267.09036: dumping result to json 30582 1726855267.09038: done dumping result, returning 30582 1726855267.09039: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affcc66-ac2b-aa83-7d57-000000000007] 30582 1726855267.09041: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000007 ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 30582 1726855267.09291: no more pending results, returning what we have 30582 1726855267.09294: results queue empty 30582 1726855267.09295: checking for any_errors_fatal 30582 1726855267.09302: done checking for any_errors_fatal 30582 1726855267.09305: checking for max_fail_percentage 30582 1726855267.09308: done checking for max_fail_percentage 30582 1726855267.09308: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.09309: done checking to see if all hosts have failed 30582 1726855267.09310: getting the remaining hosts for this loop 30582 1726855267.09311: done getting the remaining hosts for this loop 30582 1726855267.09316: getting the next task for host managed_node3 30582 1726855267.09323: done getting next task for host managed_node3 30582 1726855267.09325: ^ task is: TASK: meta (flush_handlers) 30582 1726855267.09327: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855267.09331: getting variables 30582 1726855267.09333: in VariableManager get_vars() 30582 1726855267.09366: Calling all_inventory to load vars for managed_node3 30582 1726855267.09369: Calling groups_inventory to load vars for managed_node3 30582 1726855267.09372: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.09384: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.09505: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.09512: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000007 30582 1726855267.09515: WORKER PROCESS EXITING 30582 1726855267.09520: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.09785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.10018: done with get_vars() 30582 1726855267.10028: done getting variables 30582 1726855267.10109: in VariableManager get_vars() 30582 1726855267.10119: Calling all_inventory to load vars for managed_node3 30582 1726855267.10121: Calling groups_inventory to load vars for managed_node3 30582 1726855267.10123: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.10127: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.10129: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.10132: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.10316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.10438: done with get_vars() 30582 1726855267.10448: done queuing things up, now waiting for results queue to drain 30582 1726855267.10449: results queue empty 30582 1726855267.10450: checking for any_errors_fatal 30582 1726855267.10451: done 
checking for any_errors_fatal 30582 1726855267.10452: checking for max_fail_percentage 30582 1726855267.10452: done checking for max_fail_percentage 30582 1726855267.10453: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.10453: done checking to see if all hosts have failed 30582 1726855267.10454: getting the remaining hosts for this loop 30582 1726855267.10454: done getting the remaining hosts for this loop 30582 1726855267.10456: getting the next task for host managed_node3 30582 1726855267.10458: done getting next task for host managed_node3 30582 1726855267.10459: ^ task is: TASK: meta (flush_handlers) 30582 1726855267.10459: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855267.10465: getting variables 30582 1726855267.10466: in VariableManager get_vars() 30582 1726855267.10471: Calling all_inventory to load vars for managed_node3 30582 1726855267.10476: Calling groups_inventory to load vars for managed_node3 30582 1726855267.10478: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.10482: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.10483: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.10485: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.10569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.10691: done with get_vars() 30582 1726855267.10701: done getting variables 30582 1726855267.10729: in VariableManager get_vars() 30582 1726855267.10734: Calling all_inventory to load vars for managed_node3 30582 1726855267.10736: Calling groups_inventory to load vars for managed_node3 
30582 1726855267.10737: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.10740: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.10741: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.10743: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.10845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.10960: done with get_vars() 30582 1726855267.10968: done queuing things up, now waiting for results queue to drain 30582 1726855267.10969: results queue empty 30582 1726855267.10969: checking for any_errors_fatal 30582 1726855267.10970: done checking for any_errors_fatal 30582 1726855267.10970: checking for max_fail_percentage 30582 1726855267.10971: done checking for max_fail_percentage 30582 1726855267.10972: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.10972: done checking to see if all hosts have failed 30582 1726855267.10972: getting the remaining hosts for this loop 30582 1726855267.10973: done getting the remaining hosts for this loop 30582 1726855267.10974: getting the next task for host managed_node3 30582 1726855267.10976: done getting next task for host managed_node3 30582 1726855267.10977: ^ task is: None 30582 1726855267.10977: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.10978: done queuing things up, now waiting for results queue to drain 30582 1726855267.10979: results queue empty 30582 1726855267.10979: checking for any_errors_fatal 30582 1726855267.10980: done checking for any_errors_fatal 30582 1726855267.10980: checking for max_fail_percentage 30582 1726855267.10980: done checking for max_fail_percentage 30582 1726855267.10981: checking to see if all hosts have failed and the running result is not ok 30582 1726855267.10981: done checking to see if all hosts have failed 30582 1726855267.10982: getting the next task for host managed_node3 30582 1726855267.10984: done getting next task for host managed_node3 30582 1726855267.10984: ^ task is: None 30582 1726855267.10985: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.11025: in VariableManager get_vars() 30582 1726855267.11036: done with get_vars() 30582 1726855267.11040: in VariableManager get_vars() 30582 1726855267.11046: done with get_vars() 30582 1726855267.11048: variable 'omit' from source: magic vars 30582 1726855267.11068: in VariableManager get_vars() 30582 1726855267.11074: done with get_vars() 30582 1726855267.11090: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 30582 1726855267.11282: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 30582 1726855267.11306: getting the remaining hosts for this loop 30582 1726855267.11307: done getting the remaining hosts for this loop 30582 1726855267.11309: getting the next task for host managed_node3 30582 1726855267.11310: done getting next task for host managed_node3 30582 1726855267.11312: ^ task is: TASK: Gathering Facts 30582 1726855267.11313: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855267.11314: getting variables 30582 1726855267.11314: in VariableManager get_vars() 30582 1726855267.11320: Calling all_inventory to load vars for managed_node3 30582 1726855267.11321: Calling groups_inventory to load vars for managed_node3 30582 1726855267.11323: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855267.11326: Calling all_plugins_play to load vars for managed_node3 30582 1726855267.11334: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855267.11336: Calling groups_plugins_play to load vars for managed_node3 30582 1726855267.11428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855267.11568: done with get_vars() 30582 1726855267.11575: done getting variables 30582 1726855267.11604: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 14:01:07 -0400 (0:00:00.044) 0:00:03.466 ****** 30582 1726855267.11619: entering _queue_task() for managed_node3/gather_facts 30582 1726855267.11800: worker is 1 (out of 1 available) 30582 1726855267.11810: exiting _queue_task() for managed_node3/gather_facts 30582 1726855267.11822: done queuing things up, now waiting for results queue to drain 30582 1726855267.11823: waiting for pending results... 
30582 1726855267.11962: running TaskExecutor() for managed_node3/TASK: Gathering Facts 30582 1726855267.12020: in run() - task 0affcc66-ac2b-aa83-7d57-000000000077 30582 1726855267.12025: variable 'ansible_search_path' from source: unknown 30582 1726855267.12051: calling self._execute() 30582 1726855267.12106: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.12109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.12119: variable 'omit' from source: magic vars 30582 1726855267.12373: variable 'ansible_distribution_major_version' from source: facts 30582 1726855267.12382: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855267.12390: variable 'omit' from source: magic vars 30582 1726855267.12407: variable 'omit' from source: magic vars 30582 1726855267.12431: variable 'omit' from source: magic vars 30582 1726855267.12461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855267.12490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855267.12504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855267.12517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855267.12526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855267.12547: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855267.12550: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.12553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.12624: Set connection var ansible_timeout to 10 30582 1726855267.12628: Set connection 
var ansible_connection to ssh 30582 1726855267.12632: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855267.12637: Set connection var ansible_pipelining to False 30582 1726855267.12679: Set connection var ansible_shell_executable to /bin/sh 30582 1726855267.12682: Set connection var ansible_shell_type to sh 30582 1726855267.12684: variable 'ansible_shell_executable' from source: unknown 30582 1726855267.12701: variable 'ansible_connection' from source: unknown 30582 1726855267.12704: variable 'ansible_module_compression' from source: unknown 30582 1726855267.12707: variable 'ansible_shell_type' from source: unknown 30582 1726855267.12709: variable 'ansible_shell_executable' from source: unknown 30582 1726855267.12711: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855267.12713: variable 'ansible_pipelining' from source: unknown 30582 1726855267.12715: variable 'ansible_timeout' from source: unknown 30582 1726855267.12717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855267.12843: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855267.12852: variable 'omit' from source: magic vars 30582 1726855267.12857: starting attempt loop 30582 1726855267.12859: running the handler 30582 1726855267.12871: variable 'ansible_facts' from source: unknown 30582 1726855267.12889: _low_level_execute_command(): starting 30582 1726855267.12895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855267.13401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855267.13419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.13456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855267.13468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855267.13543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855267.15894: stdout chunk (state=3): >>>/root <<< 30582 1726855267.16034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855267.16060: stderr chunk (state=3): >>><<< 30582 1726855267.16064: stdout chunk (state=3): >>><<< 30582 1726855267.16086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855267.16102: _low_level_execute_command(): starting 30582 1726855267.16108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820 `" && echo ansible-tmp-1726855267.1608496-30780-61941873953820="` echo /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820 `" ) && sleep 0' 30582 1726855267.16552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855267.16555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.16558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855267.16567: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.16610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855267.16614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855267.16686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855267.19455: stdout chunk (state=3): >>>ansible-tmp-1726855267.1608496-30780-61941873953820=/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820 <<< 30582 1726855267.19679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855267.19683: stdout chunk (state=3): >>><<< 30582 1726855267.19685: stderr chunk (state=3): >>><<< 30582 1726855267.19893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855267.1608496-30780-61941873953820=/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855267.19897: variable 'ansible_module_compression' from source: unknown 30582 1726855267.19899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30582 1726855267.19902: variable 'ansible_facts' from source: unknown 30582 1726855267.20109: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py 30582 1726855267.20375: Sending initial data 30582 1726855267.20378: Sent initial data (153 bytes) 30582 1726855267.21235: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.21311: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855267.21342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855267.21374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855267.21489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855267.23693: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855267.23732: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855267.23795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855267.23875: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppr3x1l3o /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py <<< 30582 1726855267.23879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py" <<< 30582 1726855267.23934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppr3x1l3o" to remote "/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py" <<< 30582 1726855267.25712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855267.25716: stdout chunk (state=3): >>><<< 30582 1726855267.25718: stderr chunk (state=3): >>><<< 30582 1726855267.25721: done transferring module to remote 30582 1726855267.25723: _low_level_execute_command(): starting 30582 1726855267.25725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/ /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py && sleep 0' 30582 1726855267.26581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855267.26586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.26640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855267.26665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855267.26775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855267.29326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855267.29410: stderr chunk (state=3): >>><<< 30582 1726855267.29414: stdout chunk (state=3): >>><<< 30582 1726855267.29502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855267.29505: _low_level_execute_command(): starting 30582 1726855267.29507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/AnsiballZ_setup.py && sleep 0' 30582 1726855267.30044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855267.30079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855267.30098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855267.30118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855267.30135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855267.30147: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855267.30195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.30208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855267.30240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855267.30326: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855267.30379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855267.30459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.15879: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", 
"mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5830078125, "5m": 0.6142578125, "15m": 0.3642578125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "07", "epoch": "1726855267", "epoch_int": "1726855267", "date": "2024-09-20", "time": "14:01:07", "iso8601_micro": "2024-09-20T18:01:07.740935Z", "iso8601": "2024-09-20T18:01:07Z", "iso8601_basic": "20240920T140107740935", "iso8601_basic_short": "20240920T140107", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "aa:60:c4:d8:31:87", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.9.244", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2977, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 554, "free": 2977}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": 
"2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1039, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797777408, "block_size": 4096, "block_total": 65519099, "block_available": 63915473, "block_used": 1603626, "inode_total": 131070960, "inode_available": 131029127, "inode_used": 41833, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30582 1726855268.18773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855268.18777: stdout chunk (state=3): >>><<< 30582 1726855268.18779: stderr chunk (state=3): >>><<< 30582 1726855268.18783: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5830078125, "5m": 0.6142578125, "15m": 0.3642578125}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "01", "second": "07", "epoch": "1726855267", "epoch_int": "1726855267", "date": "2024-09-20", "time": "14:01:07", "iso8601_micro": "2024-09-20T18:01:07.740935Z", "iso8601": "2024-09-20T18:01:07Z", "iso8601_basic": "20240920T140107740935", "iso8601_basic_short": "20240920T140107", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": 
"on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off 
[fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "aa:60:c4:d8:31:87", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off 
[fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_is_chroot": false, "ansible_processor": ["0", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2977, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 554, "free": 2977}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", 
"sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1039, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797777408, "block_size": 4096, "block_total": 65519099, "block_available": 63915473, "block_used": 1603626, "inode_total": 131070960, "inode_available": 131029127, "inode_used": 41833, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855268.19278: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855268.19379: _low_level_execute_command(): starting 30582 1726855268.19383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855267.1608496-30780-61941873953820/ > /dev/null 2>&1 && sleep 0' 30582 1726855268.19927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855268.19974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.20049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855268.20085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855268.20103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.20191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.22784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855268.22817: stdout chunk (state=3): >>><<< 30582 1726855268.22820: stderr chunk (state=3): >>><<< 30582 1726855268.22837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855268.22992: handler run complete 30582 1726855268.23010: variable 'ansible_facts' from source: unknown 30582 1726855268.23130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.23570: variable 'ansible_facts' from source: unknown 30582 1726855268.23786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.24055: attempt loop complete, returning result 30582 1726855268.24132: _execute() done 30582 1726855268.24139: dumping result to json 30582 1726855268.24342: done dumping result, returning 30582 1726855268.24345: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcc66-ac2b-aa83-7d57-000000000077] 30582 1726855268.24347: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000077 ok: [managed_node3] 30582 1726855268.25520: no more pending results, returning what we have 30582 1726855268.25523: results queue empty 30582 1726855268.25524: checking for any_errors_fatal 30582 1726855268.25525: done checking for any_errors_fatal 30582 1726855268.25526: checking for max_fail_percentage 30582 1726855268.25528: done checking for max_fail_percentage 30582 1726855268.25529: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.25529: done checking to see if all hosts have failed 30582 1726855268.25530: getting the remaining hosts for this loop 30582 1726855268.25531: done getting the remaining hosts for this loop 30582 1726855268.25535: getting the next task for host managed_node3 30582 1726855268.25540: done getting next task for host managed_node3 30582 1726855268.25541: ^ task is: TASK: meta (flush_handlers) 30582 1726855268.25543: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855268.25546: getting variables 30582 1726855268.25548: in VariableManager get_vars() 30582 1726855268.25569: Calling all_inventory to load vars for managed_node3 30582 1726855268.25572: Calling groups_inventory to load vars for managed_node3 30582 1726855268.25574: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.25594: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000077 30582 1726855268.25601: WORKER PROCESS EXITING 30582 1726855268.25615: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.25618: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.25621: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.25795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.26038: done with get_vars() 30582 1726855268.26052: done getting variables 30582 1726855268.26118: in VariableManager get_vars() 30582 1726855268.26126: Calling all_inventory to load vars for managed_node3 30582 1726855268.26128: Calling groups_inventory to load vars for managed_node3 30582 1726855268.26131: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.26135: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.26142: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.26145: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.26350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.26639: done with get_vars() 30582 1726855268.26652: done queuing things up, now waiting for results queue to drain 30582 1726855268.26653: results 
queue empty 30582 1726855268.26654: checking for any_errors_fatal 30582 1726855268.26657: done checking for any_errors_fatal 30582 1726855268.26659: checking for max_fail_percentage 30582 1726855268.26660: done checking for max_fail_percentage 30582 1726855268.26661: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.26705: done checking to see if all hosts have failed 30582 1726855268.26706: getting the remaining hosts for this loop 30582 1726855268.26707: done getting the remaining hosts for this loop 30582 1726855268.26710: getting the next task for host managed_node3 30582 1726855268.26714: done getting next task for host managed_node3 30582 1726855268.26716: ^ task is: TASK: Show playbook name 30582 1726855268.26717: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.26719: getting variables 30582 1726855268.26720: in VariableManager get_vars() 30582 1726855268.26729: Calling all_inventory to load vars for managed_node3 30582 1726855268.26731: Calling groups_inventory to load vars for managed_node3 30582 1726855268.26733: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.26737: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.26739: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.26742: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.26922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.27136: done with get_vars() 30582 1726855268.27144: done getting variables 30582 1726855268.27217: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Friday 20 September 2024 14:01:08 -0400 (0:00:01.156) 0:00:04.622 ****** 30582 1726855268.27249: entering _queue_task() for managed_node3/debug 30582 1726855268.27251: Creating lock for debug 30582 1726855268.27548: worker is 1 (out of 1 available) 30582 1726855268.27676: exiting _queue_task() for managed_node3/debug 30582 1726855268.27685: done queuing things up, now waiting for results queue to drain 30582 1726855268.27689: waiting for pending results... 
30582 1726855268.27824: running TaskExecutor() for managed_node3/TASK: Show playbook name 30582 1726855268.27917: in run() - task 0affcc66-ac2b-aa83-7d57-00000000000b 30582 1726855268.27934: variable 'ansible_search_path' from source: unknown 30582 1726855268.27970: calling self._execute() 30582 1726855268.28053: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.28064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.28076: variable 'omit' from source: magic vars 30582 1726855268.29093: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.29099: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.29101: variable 'omit' from source: magic vars 30582 1726855268.29104: variable 'omit' from source: magic vars 30582 1726855268.29107: variable 'omit' from source: magic vars 30582 1726855268.29110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855268.29115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.29117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855268.29119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.29297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.29331: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.29334: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.29338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.29440: Set connection var ansible_timeout to 10 30582 1726855268.29443: Set connection 
var ansible_connection to ssh 30582 1726855268.29452: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.29454: Set connection var ansible_pipelining to False 30582 1726855268.29460: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.29462: Set connection var ansible_shell_type to sh 30582 1726855268.29490: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.29651: variable 'ansible_connection' from source: unknown 30582 1726855268.29900: variable 'ansible_module_compression' from source: unknown 30582 1726855268.29911: variable 'ansible_shell_type' from source: unknown 30582 1726855268.29924: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.29934: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.29943: variable 'ansible_pipelining' from source: unknown 30582 1726855268.30093: variable 'ansible_timeout' from source: unknown 30582 1726855268.30098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.30101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.30175: variable 'omit' from source: magic vars 30582 1726855268.30186: starting attempt loop 30582 1726855268.30195: running the handler 30582 1726855268.30266: handler run complete 30582 1726855268.30303: attempt loop complete, returning result 30582 1726855268.30328: _execute() done 30582 1726855268.30335: dumping result to json 30582 1726855268.30343: done dumping result, returning 30582 1726855268.30354: done running TaskExecutor() for managed_node3/TASK: Show playbook name [0affcc66-ac2b-aa83-7d57-00000000000b] 30582 1726855268.30362: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000000b ok: [managed_node3] => {} MSG: this is: playbooks/tests_states.yml 30582 1726855268.30745: no more pending results, returning what we have 30582 1726855268.30748: results queue empty 30582 1726855268.30750: checking for any_errors_fatal 30582 1726855268.30751: done checking for any_errors_fatal 30582 1726855268.30752: checking for max_fail_percentage 30582 1726855268.30754: done checking for max_fail_percentage 30582 1726855268.30755: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.30756: done checking to see if all hosts have failed 30582 1726855268.30756: getting the remaining hosts for this loop 30582 1726855268.30758: done getting the remaining hosts for this loop 30582 1726855268.30762: getting the next task for host managed_node3 30582 1726855268.30769: done getting next task for host managed_node3 30582 1726855268.30772: ^ task is: TASK: Include the task 'run_test.yml' 30582 1726855268.30774: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.30778: getting variables 30582 1726855268.30779: in VariableManager get_vars() 30582 1726855268.30877: Calling all_inventory to load vars for managed_node3 30582 1726855268.30880: Calling groups_inventory to load vars for managed_node3 30582 1726855268.30884: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.31139: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.31142: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.31145: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.31392: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000000b 30582 1726855268.31395: WORKER PROCESS EXITING 30582 1726855268.31421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.31644: done with get_vars() 30582 1726855268.31654: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Friday 20 September 2024 14:01:08 -0400 (0:00:00.045) 0:00:04.668 ****** 30582 1726855268.31810: entering _queue_task() for managed_node3/include_tasks 30582 1726855268.32254: worker is 1 (out of 1 available) 30582 1726855268.32264: exiting _queue_task() for managed_node3/include_tasks 30582 1726855268.32274: done queuing things up, now waiting for results queue to drain 30582 1726855268.32276: waiting for pending results... 
30582 1726855268.32474: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30582 1726855268.32572: in run() - task 0affcc66-ac2b-aa83-7d57-00000000000d 30582 1726855268.32591: variable 'ansible_search_path' from source: unknown 30582 1726855268.32632: calling self._execute() 30582 1726855268.32714: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.32725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.32737: variable 'omit' from source: magic vars 30582 1726855268.33110: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.33126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.33136: _execute() done 30582 1726855268.33144: dumping result to json 30582 1726855268.33151: done dumping result, returning 30582 1726855268.33160: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-00000000000d] 30582 1726855268.33169: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000000d 30582 1726855268.33327: no more pending results, returning what we have 30582 1726855268.33332: in VariableManager get_vars() 30582 1726855268.33364: Calling all_inventory to load vars for managed_node3 30582 1726855268.33367: Calling groups_inventory to load vars for managed_node3 30582 1726855268.33371: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.33384: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.33389: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.33393: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.33808: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000000d 30582 1726855268.33811: WORKER PROCESS EXITING 30582 1726855268.33839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30582 1726855268.34186: done with get_vars() 30582 1726855268.34265: variable 'ansible_search_path' from source: unknown 30582 1726855268.34278: we have included files to process 30582 1726855268.34280: generating all_blocks data 30582 1726855268.34281: done generating all_blocks data 30582 1726855268.34282: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855268.34283: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855268.34289: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855268.36039: in VariableManager get_vars() 30582 1726855268.36057: done with get_vars() 30582 1726855268.36103: in VariableManager get_vars() 30582 1726855268.36123: done with get_vars() 30582 1726855268.36163: in VariableManager get_vars() 30582 1726855268.36178: done with get_vars() 30582 1726855268.36238: in VariableManager get_vars() 30582 1726855268.36255: done with get_vars() 30582 1726855268.36300: in VariableManager get_vars() 30582 1726855268.36316: done with get_vars() 30582 1726855268.36699: in VariableManager get_vars() 30582 1726855268.36716: done with get_vars() 30582 1726855268.36728: done processing included file 30582 1726855268.36729: iterating over new_blocks loaded from include file 30582 1726855268.36731: in VariableManager get_vars() 30582 1726855268.36741: done with get_vars() 30582 1726855268.36743: filtering new block on tags 30582 1726855268.36880: done filtering new block on tags 30582 1726855268.36883: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30582 1726855268.36890: extending task lists for all hosts with included 
blocks 30582 1726855268.36927: done extending task lists 30582 1726855268.36929: done processing included files 30582 1726855268.36929: results queue empty 30582 1726855268.36930: checking for any_errors_fatal 30582 1726855268.36933: done checking for any_errors_fatal 30582 1726855268.36934: checking for max_fail_percentage 30582 1726855268.36935: done checking for max_fail_percentage 30582 1726855268.36936: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.36937: done checking to see if all hosts have failed 30582 1726855268.36937: getting the remaining hosts for this loop 30582 1726855268.36939: done getting the remaining hosts for this loop 30582 1726855268.36941: getting the next task for host managed_node3 30582 1726855268.36944: done getting next task for host managed_node3 30582 1726855268.36947: ^ task is: TASK: TEST: {{ lsr_description }} 30582 1726855268.36949: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.36951: getting variables 30582 1726855268.36951: in VariableManager get_vars() 30582 1726855268.36959: Calling all_inventory to load vars for managed_node3 30582 1726855268.36961: Calling groups_inventory to load vars for managed_node3 30582 1726855268.36964: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.36969: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.36971: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.36974: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.37142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.37349: done with get_vars() 30582 1726855268.37359: done getting variables 30582 1726855268.37399: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855268.37511: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 14:01:08 -0400 (0:00:00.057) 0:00:04.725 ****** 30582 1726855268.37542: entering _queue_task() for managed_node3/debug 30582 1726855268.37792: worker is 1 (out of 1 available) 30582 1726855268.37807: exiting _queue_task() for managed_node3/debug 30582 1726855268.37818: done queuing things up, now waiting for results queue to drain 30582 1726855268.37819: waiting for pending results... 
30582 1726855268.37970: running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile 30582 1726855268.38029: in run() - task 0affcc66-ac2b-aa83-7d57-000000000091 30582 1726855268.38041: variable 'ansible_search_path' from source: unknown 30582 1726855268.38049: variable 'ansible_search_path' from source: unknown 30582 1726855268.38085: calling self._execute() 30582 1726855268.38150: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.38153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.38161: variable 'omit' from source: magic vars 30582 1726855268.38434: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.38445: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.38450: variable 'omit' from source: magic vars 30582 1726855268.38474: variable 'omit' from source: magic vars 30582 1726855268.38548: variable 'lsr_description' from source: include params 30582 1726855268.38561: variable 'omit' from source: magic vars 30582 1726855268.38594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855268.38627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.38643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855268.38656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.38666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.38690: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.38693: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.38696: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.38767: Set connection var ansible_timeout to 10 30582 1726855268.38771: Set connection var ansible_connection to ssh 30582 1726855268.38776: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.38781: Set connection var ansible_pipelining to False 30582 1726855268.38786: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.38790: Set connection var ansible_shell_type to sh 30582 1726855268.38809: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.38814: variable 'ansible_connection' from source: unknown 30582 1726855268.38816: variable 'ansible_module_compression' from source: unknown 30582 1726855268.38820: variable 'ansible_shell_type' from source: unknown 30582 1726855268.38823: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.38825: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.38827: variable 'ansible_pipelining' from source: unknown 30582 1726855268.38830: variable 'ansible_timeout' from source: unknown 30582 1726855268.38832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.38930: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.38938: variable 'omit' from source: magic vars 30582 1726855268.38948: starting attempt loop 30582 1726855268.38952: running the handler 30582 1726855268.38982: handler run complete 30582 1726855268.38994: attempt loop complete, returning result 30582 1726855268.38997: _execute() done 30582 1726855268.39000: dumping result to json 30582 1726855268.39005: done dumping result, returning 30582 1726855268.39012: done 
running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile [0affcc66-ac2b-aa83-7d57-000000000091] 30582 1726855268.39016: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000091 30582 1726855268.39099: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000091 30582 1726855268.39102: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can create a profile ########## 30582 1726855268.39147: no more pending results, returning what we have 30582 1726855268.39151: results queue empty 30582 1726855268.39152: checking for any_errors_fatal 30582 1726855268.39153: done checking for any_errors_fatal 30582 1726855268.39153: checking for max_fail_percentage 30582 1726855268.39155: done checking for max_fail_percentage 30582 1726855268.39156: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.39156: done checking to see if all hosts have failed 30582 1726855268.39157: getting the remaining hosts for this loop 30582 1726855268.39160: done getting the remaining hosts for this loop 30582 1726855268.39166: getting the next task for host managed_node3 30582 1726855268.39172: done getting next task for host managed_node3 30582 1726855268.39174: ^ task is: TASK: Show item 30582 1726855268.39178: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.39181: getting variables 30582 1726855268.39183: in VariableManager get_vars() 30582 1726855268.39212: Calling all_inventory to load vars for managed_node3 30582 1726855268.39215: Calling groups_inventory to load vars for managed_node3 30582 1726855268.39217: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.39228: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.39231: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.39234: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.39470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.39702: done with get_vars() 30582 1726855268.39712: done getting variables 30582 1726855268.39777: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 14:01:08 -0400 (0:00:00.022) 0:00:04.748 ****** 30582 1726855268.39890: entering _queue_task() for managed_node3/debug 30582 1726855268.40363: worker is 1 (out of 1 available) 30582 1726855268.40374: exiting _queue_task() for managed_node3/debug 30582 1726855268.40505: done queuing things up, now waiting for results queue to drain 30582 1726855268.40507: waiting for pending results... 
30582 1726855268.40732: running TaskExecutor() for managed_node3/TASK: Show item 30582 1726855268.40829: in run() - task 0affcc66-ac2b-aa83-7d57-000000000092 30582 1726855268.40833: variable 'ansible_search_path' from source: unknown 30582 1726855268.40836: variable 'ansible_search_path' from source: unknown 30582 1726855268.40913: variable 'omit' from source: magic vars 30582 1726855268.41059: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.41072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.41100: variable 'omit' from source: magic vars 30582 1726855268.41357: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.41373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.41376: variable 'omit' from source: magic vars 30582 1726855268.41402: variable 'omit' from source: magic vars 30582 1726855268.41434: variable 'item' from source: unknown 30582 1726855268.41490: variable 'item' from source: unknown 30582 1726855268.41506: variable 'omit' from source: magic vars 30582 1726855268.41537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855268.41565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.41580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855268.41596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.41608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.41629: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.41633: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855268.41635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.41709: Set connection var ansible_timeout to 10 30582 1726855268.41712: Set connection var ansible_connection to ssh 30582 1726855268.41718: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.41723: Set connection var ansible_pipelining to False 30582 1726855268.41727: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.41730: Set connection var ansible_shell_type to sh 30582 1726855268.41743: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.41746: variable 'ansible_connection' from source: unknown 30582 1726855268.41748: variable 'ansible_module_compression' from source: unknown 30582 1726855268.41751: variable 'ansible_shell_type' from source: unknown 30582 1726855268.41753: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.41755: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.41759: variable 'ansible_pipelining' from source: unknown 30582 1726855268.41762: variable 'ansible_timeout' from source: unknown 30582 1726855268.41766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.41868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.41876: variable 'omit' from source: magic vars 30582 1726855268.41881: starting attempt loop 30582 1726855268.41886: running the handler 30582 1726855268.41927: variable 'lsr_description' from source: include params 30582 1726855268.41971: variable 'lsr_description' from source: include params 30582 1726855268.41978: handler run complete 30582 1726855268.41993: attempt loop 
complete, returning result 30582 1726855268.42008: variable 'item' from source: unknown 30582 1726855268.42052: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile" } 30582 1726855268.42185: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.42190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.42193: variable 'omit' from source: magic vars 30582 1726855268.42263: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.42266: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.42271: variable 'omit' from source: magic vars 30582 1726855268.42281: variable 'omit' from source: magic vars 30582 1726855268.42315: variable 'item' from source: unknown 30582 1726855268.42356: variable 'item' from source: unknown 30582 1726855268.42366: variable 'omit' from source: magic vars 30582 1726855268.42380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.42386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.42394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.42406: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.42409: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.42413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.42460: Set connection var ansible_timeout to 10 30582 1726855268.42463: Set connection var ansible_connection to ssh 30582 1726855268.42468: Set 
connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.42473: Set connection var ansible_pipelining to False 30582 1726855268.42478: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.42480: Set connection var ansible_shell_type to sh 30582 1726855268.42495: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.42498: variable 'ansible_connection' from source: unknown 30582 1726855268.42503: variable 'ansible_module_compression' from source: unknown 30582 1726855268.42505: variable 'ansible_shell_type' from source: unknown 30582 1726855268.42507: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.42509: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.42514: variable 'ansible_pipelining' from source: unknown 30582 1726855268.42516: variable 'ansible_timeout' from source: unknown 30582 1726855268.42524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.42605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.42637: variable 'omit' from source: magic vars 30582 1726855268.42714: starting attempt loop 30582 1726855268.42720: running the handler 30582 1726855268.42723: variable 'lsr_setup' from source: include params 30582 1726855268.42741: variable 'lsr_setup' from source: include params 30582 1726855268.42844: handler run complete 30582 1726855268.42847: attempt loop complete, returning result 30582 1726855268.42850: variable 'item' from source: unknown 30582 1726855268.42903: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30582 1726855268.43171: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.43173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.43178: variable 'omit' from source: magic vars 30582 1726855268.43266: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.43292: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.43383: variable 'omit' from source: magic vars 30582 1726855268.43386: variable 'omit' from source: magic vars 30582 1726855268.43391: variable 'item' from source: unknown 30582 1726855268.43431: variable 'item' from source: unknown 30582 1726855268.43449: variable 'omit' from source: magic vars 30582 1726855268.43469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.43800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.43803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.43805: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.43807: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.43809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.43811: Set connection var ansible_timeout to 10 30582 1726855268.43813: Set connection var ansible_connection to ssh 30582 1726855268.43815: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.43816: Set connection var ansible_pipelining to False 30582 1726855268.43827: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.43833: 
Set connection var ansible_shell_type to sh 30582 1726855268.43855: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.43862: variable 'ansible_connection' from source: unknown 30582 1726855268.43868: variable 'ansible_module_compression' from source: unknown 30582 1726855268.43874: variable 'ansible_shell_type' from source: unknown 30582 1726855268.43880: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.43894: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.43913: variable 'ansible_pipelining' from source: unknown 30582 1726855268.43920: variable 'ansible_timeout' from source: unknown 30582 1726855268.43927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.44103: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.44137: variable 'omit' from source: magic vars 30582 1726855268.44175: starting attempt loop 30582 1726855268.44183: running the handler 30582 1726855268.44211: variable 'lsr_test' from source: include params 30582 1726855268.44561: variable 'lsr_test' from source: include params 30582 1726855268.44564: handler run complete 30582 1726855268.44566: attempt loop complete, returning result 30582 1726855268.44594: variable 'item' from source: unknown 30582 1726855268.44661: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile.yml" ] } 30582 1726855268.45084: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.45090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.45092: 
variable 'omit' from source: magic vars 30582 1726855268.45204: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.45246: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.45251: variable 'omit' from source: magic vars 30582 1726855268.45254: variable 'omit' from source: magic vars 30582 1726855268.45295: variable 'item' from source: unknown 30582 1726855268.45362: variable 'item' from source: unknown 30582 1726855268.45492: variable 'omit' from source: magic vars 30582 1726855268.45495: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.45501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.45503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.45505: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.45507: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.45509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.45534: Set connection var ansible_timeout to 10 30582 1726855268.45542: Set connection var ansible_connection to ssh 30582 1726855268.45554: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.45564: Set connection var ansible_pipelining to False 30582 1726855268.45573: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.45579: Set connection var ansible_shell_type to sh 30582 1726855268.45607: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.45615: variable 'ansible_connection' from source: unknown 30582 1726855268.45622: variable 'ansible_module_compression' from source: unknown 
30582 1726855268.45628: variable 'ansible_shell_type' from source: unknown 30582 1726855268.45635: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.45642: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.45650: variable 'ansible_pipelining' from source: unknown 30582 1726855268.45657: variable 'ansible_timeout' from source: unknown 30582 1726855268.45664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.45758: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.45772: variable 'omit' from source: magic vars 30582 1726855268.45781: starting attempt loop 30582 1726855268.45789: running the handler 30582 1726855268.45816: variable 'lsr_assert' from source: include params 30582 1726855268.45878: variable 'lsr_assert' from source: include params 30582 1726855268.45904: handler run complete 30582 1726855268.45991: attempt loop complete, returning result 30582 1726855268.45995: variable 'item' from source: unknown 30582 1726855268.46000: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_present.yml" ] } 30582 1726855268.46293: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.46300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.46302: variable 'omit' from source: magic vars 30582 1726855268.46323: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.46334: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.46347: variable 'omit' from source: 
magic vars 30582 1726855268.46366: variable 'omit' from source: magic vars 30582 1726855268.46413: variable 'item' from source: unknown 30582 1726855268.46476: variable 'item' from source: unknown 30582 1726855268.46499: variable 'omit' from source: magic vars 30582 1726855268.46522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.46534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.46544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.46558: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.46564: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.46570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.46645: Set connection var ansible_timeout to 10 30582 1726855268.46652: Set connection var ansible_connection to ssh 30582 1726855268.46670: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.46717: Set connection var ansible_pipelining to False 30582 1726855268.46727: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.46732: Set connection var ansible_shell_type to sh 30582 1726855268.46784: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.46833: variable 'ansible_connection' from source: unknown 30582 1726855268.46837: variable 'ansible_module_compression' from source: unknown 30582 1726855268.46839: variable 'ansible_shell_type' from source: unknown 30582 1726855268.46841: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.46843: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.46845: variable 
'ansible_pipelining' from source: unknown 30582 1726855268.46847: variable 'ansible_timeout' from source: unknown 30582 1726855268.46849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.47007: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.47031: variable 'omit' from source: magic vars 30582 1726855268.47038: starting attempt loop 30582 1726855268.47066: running the handler 30582 1726855268.47292: variable 'lsr_assert_when' from source: include params 30582 1726855268.47297: variable 'lsr_assert_when' from source: include params 30582 1726855268.47301: variable 'network_provider' from source: set_fact 30582 1726855268.47304: handler run complete 30582 1726855268.47306: attempt loop complete, returning result 30582 1726855268.47309: variable 'item' from source: unknown 30582 1726855268.47397: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 30582 1726855268.47553: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.47560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.47564: variable 'omit' from source: magic vars 30582 1726855268.47800: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.47803: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.47805: variable 'omit' from source: magic vars 30582 1726855268.47807: variable 'omit' from source: magic vars 30582 1726855268.47809: variable 'item' from source: unknown 30582 
1726855268.47847: variable 'item' from source: unknown 30582 1726855268.47865: variable 'omit' from source: magic vars 30582 1726855268.47886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.47908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.47939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.48012: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.48015: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.48017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.48056: Set connection var ansible_timeout to 10 30582 1726855268.48063: Set connection var ansible_connection to ssh 30582 1726855268.48073: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.48081: Set connection var ansible_pipelining to False 30582 1726855268.48117: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.48120: Set connection var ansible_shell_type to sh 30582 1726855268.48122: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.48131: variable 'ansible_connection' from source: unknown 30582 1726855268.48145: variable 'ansible_module_compression' from source: unknown 30582 1726855268.48148: variable 'ansible_shell_type' from source: unknown 30582 1726855268.48156: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.48158: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.48163: variable 'ansible_pipelining' from source: unknown 30582 1726855268.48165: variable 'ansible_timeout' from source: unknown 30582 1726855268.48169: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.48238: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.48242: variable 'omit' from source: magic vars 30582 1726855268.48244: starting attempt loop 30582 1726855268.48247: running the handler 30582 1726855268.48273: variable 'lsr_fail_debug' from source: play vars 30582 1726855268.48340: variable 'lsr_fail_debug' from source: play vars 30582 1726855268.48353: handler run complete 30582 1726855268.48362: attempt loop complete, returning result 30582 1726855268.48376: variable 'item' from source: unknown 30582 1726855268.48436: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855268.48520: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.48523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.48526: variable 'omit' from source: magic vars 30582 1726855268.48796: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.48800: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.48802: variable 'omit' from source: magic vars 30582 1726855268.48804: variable 'omit' from source: magic vars 30582 1726855268.48806: variable 'item' from source: unknown 30582 1726855268.48813: variable 'item' from source: unknown 30582 1726855268.48830: variable 'omit' from source: magic vars 30582 1726855268.48851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 30582 1726855268.48866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.49012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.49015: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.49017: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.49020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.49123: Set connection var ansible_timeout to 10 30582 1726855268.49126: Set connection var ansible_connection to ssh 30582 1726855268.49129: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.49131: Set connection var ansible_pipelining to False 30582 1726855268.49133: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.49135: Set connection var ansible_shell_type to sh 30582 1726855268.49242: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.49251: variable 'ansible_connection' from source: unknown 30582 1726855268.49257: variable 'ansible_module_compression' from source: unknown 30582 1726855268.49264: variable 'ansible_shell_type' from source: unknown 30582 1726855268.49270: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.49276: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.49283: variable 'ansible_pipelining' from source: unknown 30582 1726855268.49292: variable 'ansible_timeout' from source: unknown 30582 1726855268.49300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.49598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.49601: variable 'omit' from source: magic vars 30582 1726855268.49604: starting attempt loop 30582 1726855268.49606: running the handler 30582 1726855268.49626: variable 'lsr_cleanup' from source: include params 30582 1726855268.49698: variable 'lsr_cleanup' from source: include params 30582 1726855268.49895: handler run complete 30582 1726855268.49898: attempt loop complete, returning result 30582 1726855268.49901: variable 'item' from source: unknown 30582 1726855268.49958: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30582 1726855268.50359: dumping result to json 30582 1726855268.50362: done dumping result, returning 30582 1726855268.50364: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-000000000092] 30582 1726855268.50366: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000092 30582 1726855268.50413: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000092 30582 1726855268.50416: WORKER PROCESS EXITING 30582 1726855268.50463: no more pending results, returning what we have 30582 1726855268.50467: results queue empty 30582 1726855268.50468: checking for any_errors_fatal 30582 1726855268.50473: done checking for any_errors_fatal 30582 1726855268.50474: checking for max_fail_percentage 30582 1726855268.50475: done checking for max_fail_percentage 30582 1726855268.50476: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.50477: done checking to see if all hosts have failed 30582 1726855268.50477: getting the remaining hosts for this loop 30582 1726855268.50479: done getting the remaining hosts for this loop 30582 
1726855268.50482: getting the next task for host managed_node3 30582 1726855268.50491: done getting next task for host managed_node3 30582 1726855268.50493: ^ task is: TASK: Include the task 'show_interfaces.yml' 30582 1726855268.50498: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855268.50501: getting variables 30582 1726855268.50503: in VariableManager get_vars() 30582 1726855268.50529: Calling all_inventory to load vars for managed_node3 30582 1726855268.50532: Calling groups_inventory to load vars for managed_node3 30582 1726855268.50535: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.50546: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.50548: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.50550: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.50854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.51052: done with get_vars() 30582 1726855268.51063: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 14:01:08 -0400 (0:00:00.113) 0:00:04.861 ****** 30582 1726855268.51159: entering _queue_task() for managed_node3/include_tasks 30582 
1726855268.51439: worker is 1 (out of 1 available) 30582 1726855268.51457: exiting _queue_task() for managed_node3/include_tasks 30582 1726855268.51466: done queuing things up, now waiting for results queue to drain 30582 1726855268.51468: waiting for pending results... 30582 1726855268.51669: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855268.51779: in run() - task 0affcc66-ac2b-aa83-7d57-000000000093 30582 1726855268.51800: variable 'ansible_search_path' from source: unknown 30582 1726855268.51807: variable 'ansible_search_path' from source: unknown 30582 1726855268.51845: calling self._execute() 30582 1726855268.51926: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.51978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.51982: variable 'omit' from source: magic vars 30582 1726855268.52331: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.52349: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.52360: _execute() done 30582 1726855268.52367: dumping result to json 30582 1726855268.52375: done dumping result, returning 30582 1726855268.52385: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-000000000093] 30582 1726855268.52411: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000093 30582 1726855268.52549: no more pending results, returning what we have 30582 1726855268.52555: in VariableManager get_vars() 30582 1726855268.52593: Calling all_inventory to load vars for managed_node3 30582 1726855268.52597: Calling groups_inventory to load vars for managed_node3 30582 1726855268.52601: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.52616: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.52620: Calling groups_plugins_inventory to load 
vars for managed_node3 30582 1726855268.52623: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.53143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.53474: done with get_vars() 30582 1726855268.53482: variable 'ansible_search_path' from source: unknown 30582 1726855268.53484: variable 'ansible_search_path' from source: unknown 30582 1726855268.53501: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000093 30582 1726855268.53504: WORKER PROCESS EXITING 30582 1726855268.53543: we have included files to process 30582 1726855268.53545: generating all_blocks data 30582 1726855268.53546: done generating all_blocks data 30582 1726855268.53551: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855268.53552: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855268.53554: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855268.53706: in VariableManager get_vars() 30582 1726855268.53725: done with get_vars() 30582 1726855268.53847: done processing included file 30582 1726855268.53849: iterating over new_blocks loaded from include file 30582 1726855268.53851: in VariableManager get_vars() 30582 1726855268.53863: done with get_vars() 30582 1726855268.53865: filtering new block on tags 30582 1726855268.53901: done filtering new block on tags 30582 1726855268.53903: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30582 1726855268.53928: extending task lists for all hosts with included blocks 30582 1726855268.54227: 
done extending task lists 30582 1726855268.54229: done processing included files 30582 1726855268.54229: results queue empty 30582 1726855268.54230: checking for any_errors_fatal 30582 1726855268.54233: done checking for any_errors_fatal 30582 1726855268.54234: checking for max_fail_percentage 30582 1726855268.54234: done checking for max_fail_percentage 30582 1726855268.54235: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.54235: done checking to see if all hosts have failed 30582 1726855268.54236: getting the remaining hosts for this loop 30582 1726855268.54237: done getting the remaining hosts for this loop 30582 1726855268.54238: getting the next task for host managed_node3 30582 1726855268.54241: done getting next task for host managed_node3 30582 1726855268.54242: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855268.54244: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.54246: getting variables 30582 1726855268.54247: in VariableManager get_vars() 30582 1726855268.54252: Calling all_inventory to load vars for managed_node3 30582 1726855268.54254: Calling groups_inventory to load vars for managed_node3 30582 1726855268.54255: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.54259: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.54260: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.54262: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.54353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.54474: done with get_vars() 30582 1726855268.54480: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 14:01:08 -0400 (0:00:00.033) 0:00:04.895 ****** 30582 1726855268.54535: entering _queue_task() for managed_node3/include_tasks 30582 1726855268.54736: worker is 1 (out of 1 available) 30582 1726855268.54750: exiting _queue_task() for managed_node3/include_tasks 30582 1726855268.54761: done queuing things up, now waiting for results queue to drain 30582 1726855268.54763: waiting for pending results... 
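The queuing narrative in the records above (`entering _queue_task()`, `worker is 1 (out of 1 available)`, `exiting _queue_task()`, `waiting for pending results...`) reflects the linear strategy's dispatch loop: the main process hands one task to a worker, then drains a shared results queue before moving on. A minimal sketch of that producer/worker/drain shape, using threads as a stand-in for the forked `WorkerProcess` instances ansible-core actually uses (the names `worker` and `queue_task` here are illustrative, not ansible-core APIs):

```python
import queue
import threading

def worker(task_q: queue.Queue, result_q: queue.Queue) -> None:
    """Stand-in for a WorkerProcess: take one task, run it, report, exit."""
    task = task_q.get()
    # "running the handler" -- here the handler just echoes the task back
    result_q.put({"task": task, "status": "ok"})
    # "WORKER PROCESS EXITING"

def queue_task(task: str) -> dict:
    """Stand-in for _queue_task(): dispatch one task to one worker,
    then wait for the pending result to drain, as the log describes."""
    task_q: queue.Queue = queue.Queue()
    result_q: queue.Queue = queue.Queue()
    t = threading.Thread(target=worker, args=(task_q, result_q))
    t.start()
    task_q.put(task)          # "done queuing things up, now waiting for results queue to drain"
    result = result_q.get()   # "waiting for pending results..."
    t.join()
    return result             # "no more pending results, returning what we have"

print(queue_task("Include the task 'show_interfaces.yml'"))
```

The real strategy plugin multiplexes many hosts and tasks over a pool of forked workers and serializes results as JSON (the "dumping result to json" / "sending task result" records), but the queue-in, queue-out shape is the same.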
30582 1726855268.54919: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855268.54972: in run() - task 0affcc66-ac2b-aa83-7d57-0000000000ba 30582 1726855268.54985: variable 'ansible_search_path' from source: unknown 30582 1726855268.54992: variable 'ansible_search_path' from source: unknown 30582 1726855268.55021: calling self._execute() 30582 1726855268.55070: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.55073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.55082: variable 'omit' from source: magic vars 30582 1726855268.55360: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.55384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.55392: _execute() done 30582 1726855268.55493: dumping result to json 30582 1726855268.55499: done dumping result, returning 30582 1726855268.55502: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-0000000000ba] 30582 1726855268.55504: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000ba 30582 1726855268.55567: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000ba 30582 1726855268.55570: WORKER PROCESS EXITING 30582 1726855268.55615: no more pending results, returning what we have 30582 1726855268.55620: in VariableManager get_vars() 30582 1726855268.55650: Calling all_inventory to load vars for managed_node3 30582 1726855268.55653: Calling groups_inventory to load vars for managed_node3 30582 1726855268.55656: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.55667: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.55670: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.55672: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855268.55885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.56103: done with get_vars() 30582 1726855268.56111: variable 'ansible_search_path' from source: unknown 30582 1726855268.56112: variable 'ansible_search_path' from source: unknown 30582 1726855268.56145: we have included files to process 30582 1726855268.56146: generating all_blocks data 30582 1726855268.56148: done generating all_blocks data 30582 1726855268.56149: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855268.56150: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855268.56152: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855268.56461: done processing included file 30582 1726855268.56463: iterating over new_blocks loaded from include file 30582 1726855268.56464: in VariableManager get_vars() 30582 1726855268.56478: done with get_vars() 30582 1726855268.56479: filtering new block on tags 30582 1726855268.56524: done filtering new block on tags 30582 1726855268.56526: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855268.56531: extending task lists for all hosts with included blocks 30582 1726855268.56700: done extending task lists 30582 1726855268.56702: done processing included files 30582 1726855268.56703: results queue empty 30582 1726855268.56703: checking for any_errors_fatal 30582 1726855268.56707: done checking for any_errors_fatal 30582 1726855268.56708: checking for max_fail_percentage 30582 1726855268.56709: done 
checking for max_fail_percentage 30582 1726855268.56709: checking to see if all hosts have failed and the running result is not ok 30582 1726855268.56710: done checking to see if all hosts have failed 30582 1726855268.56710: getting the remaining hosts for this loop 30582 1726855268.56711: done getting the remaining hosts for this loop 30582 1726855268.56713: getting the next task for host managed_node3 30582 1726855268.56716: done getting next task for host managed_node3 30582 1726855268.56717: ^ task is: TASK: Gather current interface info 30582 1726855268.56720: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855268.56721: getting variables 30582 1726855268.56722: in VariableManager get_vars() 30582 1726855268.56732: Calling all_inventory to load vars for managed_node3 30582 1726855268.56734: Calling groups_inventory to load vars for managed_node3 30582 1726855268.56736: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855268.56745: Calling all_plugins_play to load vars for managed_node3 30582 1726855268.56748: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855268.56751: Calling groups_plugins_play to load vars for managed_node3 30582 1726855268.56868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855268.56999: done with get_vars() 30582 1726855268.57006: done getting variables 30582 1726855268.57031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:01:08 -0400 (0:00:00.025) 0:00:04.920 ****** 30582 1726855268.57050: entering _queue_task() for managed_node3/command 30582 1726855268.57246: worker is 1 (out of 1 available) 30582 1726855268.57260: exiting _queue_task() for managed_node3/command 30582 1726855268.57271: done queuing things up, now waiting for results queue to drain 30582 1726855268.57273: waiting for pending results... 
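From this point the log records the `_low_level_execute_command()` sequence for the `command` module: a `/bin/sh -c 'echo ~ && sleep 0'` probe for the remote home directory, creation of a umask-protected temporary directory (`umask 77 && mkdir -p ...`), an sftp transfer of the AnsiballZ payload, then `chmod u+x` on the directory and payload and execution with the remote Python. A local sketch of that command sequence, assuming plain `subprocess` in place of the SSH connection plugin that actually carries each step (the payload here is a trivial stand-in, not a real AnsiballZ module):

```python
import os
import stat
import subprocess
import tempfile

# 1. Home-directory probe, exactly as logged.
home = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# 2. umask-protected tmp dir (umask 77 => directory mode 0700).
base = tempfile.mkdtemp()
tmpdir = os.path.join(base, "ansible-tmp-demo")
subprocess.run(
    ["/bin/sh", "-c", f'( umask 77 && mkdir -p "{tmpdir}" ) && sleep 0'],
    check=True,
)

# 3. Stand-in for the transferred AnsiballZ_command.py payload.
payload = os.path.join(tmpdir, "AnsiballZ_command.py")
with open(payload, "w") as f:
    f.write('print("module ran")\n')

# 4. chmod u+x on dir and payload, then run the payload with python.
subprocess.run(
    ["/bin/sh", "-c", f'chmod u+x "{tmpdir}" "{payload}" && sleep 0'],
    check=True,
)
out = subprocess.run(
    ["/bin/sh", "-c", f'python3 "{payload}" && sleep 0'],
    capture_output=True, text=True, check=True,
).stdout.strip()

print(home, oct(stat.S_IMODE(os.stat(tmpdir).st_mode)), out)
```

In the real run each of these `/bin/sh -c` strings is wrapped in an `ssh` invocation that reuses a multiplexed master connection (the `auto-mux: Trying existing master at '/root/.ansible/cp/...'` lines in the stderr chunks), which is why repeated low-level commands complete without re-authenticating.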
30582 1726855268.57408: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855268.57475: in run() - task 0affcc66-ac2b-aa83-7d57-0000000000f5 30582 1726855268.57486: variable 'ansible_search_path' from source: unknown 30582 1726855268.57491: variable 'ansible_search_path' from source: unknown 30582 1726855268.57526: calling self._execute() 30582 1726855268.57574: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.57578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.57586: variable 'omit' from source: magic vars 30582 1726855268.57847: variable 'ansible_distribution_major_version' from source: facts 30582 1726855268.57856: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855268.57863: variable 'omit' from source: magic vars 30582 1726855268.57902: variable 'omit' from source: magic vars 30582 1726855268.57927: variable 'omit' from source: magic vars 30582 1726855268.57960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855268.57985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855268.58004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855268.58017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.58027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855268.58052: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855268.58055: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.58057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855268.58128: Set connection var ansible_timeout to 10 30582 1726855268.58131: Set connection var ansible_connection to ssh 30582 1726855268.58137: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855268.58141: Set connection var ansible_pipelining to False 30582 1726855268.58148: Set connection var ansible_shell_executable to /bin/sh 30582 1726855268.58150: Set connection var ansible_shell_type to sh 30582 1726855268.58168: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.58171: variable 'ansible_connection' from source: unknown 30582 1726855268.58174: variable 'ansible_module_compression' from source: unknown 30582 1726855268.58177: variable 'ansible_shell_type' from source: unknown 30582 1726855268.58179: variable 'ansible_shell_executable' from source: unknown 30582 1726855268.58181: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855268.58183: variable 'ansible_pipelining' from source: unknown 30582 1726855268.58185: variable 'ansible_timeout' from source: unknown 30582 1726855268.58192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855268.58292: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855268.58304: variable 'omit' from source: magic vars 30582 1726855268.58339: starting attempt loop 30582 1726855268.58342: running the handler 30582 1726855268.58345: _low_level_execute_command(): starting 30582 1726855268.58347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855268.59049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855268.59122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.59183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.61025: stdout chunk (state=3): >>>/root <<< 30582 1726855268.61162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855268.61194: stderr chunk (state=3): >>><<< 30582 1726855268.61197: stdout chunk (state=3): >>><<< 30582 1726855268.61220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855268.61231: _low_level_execute_command(): starting 30582 1726855268.61238: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152 `" && echo ansible-tmp-1726855268.6122046-30851-114235334415152="` echo /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152 `" ) && sleep 0' 30582 1726855268.61677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855268.61681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855268.61683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855268.61686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855268.61696: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.61740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.61812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.64189: stdout chunk (state=3): >>>ansible-tmp-1726855268.6122046-30851-114235334415152=/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152 <<< 30582 1726855268.64344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855268.64369: stderr chunk (state=3): >>><<< 30582 1726855268.64372: stdout chunk (state=3): >>><<< 30582 1726855268.64386: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855268.6122046-30851-114235334415152=/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855268.64426: variable 'ansible_module_compression' from source: unknown 30582 1726855268.64463: ANSIBALLZ: Using generic lock for ansible.legacy.command 30582 1726855268.64466: ANSIBALLZ: Acquiring lock 30582 1726855268.64469: ANSIBALLZ: Lock acquired: 140270807060400 30582 1726855268.64471: ANSIBALLZ: Creating module 30582 1726855268.72220: ANSIBALLZ: Writing module into payload 30582 1726855268.72281: ANSIBALLZ: Writing module 30582 1726855268.72302: ANSIBALLZ: Renaming module 30582 1726855268.72307: ANSIBALLZ: Done creating module 30582 1726855268.72321: variable 'ansible_facts' from source: unknown 30582 1726855268.72365: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py 30582 1726855268.72467: Sending initial data 30582 1726855268.72470: Sent initial data (156 bytes) 30582 1726855268.72926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855268.72931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.72953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.72993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855268.73009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.73081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.75309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855268.75370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855268.75434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpbf123js8 /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py <<< 30582 1726855268.75438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py" <<< 30582 1726855268.75496: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpbf123js8" to remote "/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py" <<< 30582 1726855268.75499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py" <<< 30582 1726855268.76115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855268.76158: stderr chunk (state=3): >>><<< 30582 1726855268.76161: stdout chunk (state=3): >>><<< 30582 1726855268.76192: done transferring module to remote 30582 1726855268.76203: _low_level_execute_command(): starting 30582 1726855268.76207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/ /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py && sleep 0' 30582 1726855268.76660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855268.76663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855268.76666: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.76668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855268.76670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855268.76672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.76727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855268.76734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.76797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.79364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855268.79389: stderr chunk (state=3): >>><<< 30582 1726855268.79393: stdout chunk (state=3): >>><<< 30582 1726855268.79412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30582 1726855268.79416: _low_level_execute_command(): starting 30582 1726855268.79418: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/AnsiballZ_command.py && sleep 0' 30582 1726855268.79857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855268.79860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.79862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855268.79865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855268.79866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855268.79916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855268.79919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855268.79992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30582 1726855268.98632: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:08.981851", "end": "2024-09-20 14:01:08.985215", "delta": "0:00:00.003364", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855269.00121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855269.00144: stderr chunk (state=3): >>><<< 30582 1726855269.00149: stdout chunk (state=3): >>><<< 30582 1726855269.00168: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:08.981851", "end": "2024-09-20 14:01:08.985215", "delta": "0:00:00.003364", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855269.00200: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855269.00204: _low_level_execute_command(): starting 30582 1726855269.00209: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855268.6122046-30851-114235334415152/ > /dev/null 2>&1 && sleep 0' 30582 1726855269.00750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.00754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855269.00756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.00758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.00764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855269.00766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.00817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.00824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855269.00829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.00879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.02691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.02717: stderr chunk (state=3): >>><<< 30582 1726855269.02721: stdout chunk (state=3): >>><<< 30582 1726855269.02734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.02739: handler run complete 30582 1726855269.02760: Evaluated conditional (False): False 30582 1726855269.02771: attempt loop complete, returning result 30582 1726855269.02773: _execute() done 30582 1726855269.02776: dumping result to json 30582 1726855269.02778: done dumping result, returning 30582 1726855269.02789: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-0000000000f5] 30582 1726855269.02794: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000f5 30582 1726855269.02889: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000f5 30582 1726855269.02892: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003364", "end": "2024-09-20 14:01:08.985215", "rc": 0, "start": "2024-09-20 14:01:08.981851" } STDOUT: bonding_masters eth0 lo rpltstbr 30582 1726855269.02959: no more pending results, returning what we have 30582 1726855269.02963: results queue empty 30582 1726855269.02964: checking for any_errors_fatal 30582 1726855269.02965: done checking for any_errors_fatal 30582 1726855269.02966: checking for max_fail_percentage 30582 1726855269.02967: done checking for max_fail_percentage 30582 1726855269.02968: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.02969: done checking to see if all hosts have failed 30582 1726855269.02969: getting the remaining hosts for this loop 30582 1726855269.02971: done getting the remaining hosts for this loop 30582 1726855269.02974: getting the next task for host managed_node3 30582 1726855269.02981: done getting next task for host managed_node3 30582 1726855269.02983: ^ task is: TASK: Set current_interfaces 30582 
1726855269.02989: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855269.02993: getting variables 30582 1726855269.02994: in VariableManager get_vars() 30582 1726855269.03025: Calling all_inventory to load vars for managed_node3 30582 1726855269.03027: Calling groups_inventory to load vars for managed_node3 30582 1726855269.03030: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.03041: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.03043: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.03046: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.03219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.03346: done with get_vars() 30582 1726855269.03354: done getting variables 30582 1726855269.03398: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 14:01:09 -0400 (0:00:00.463) 0:00:05.384 ****** 30582 1726855269.03427: entering _queue_task() for managed_node3/set_fact 30582 1726855269.03638: worker is 1 (out of 1 available) 30582 1726855269.03651: exiting _queue_task() for managed_node3/set_fact 30582 1726855269.03662: done queuing things up, now waiting for results queue to drain 30582 1726855269.03664: waiting for pending results... 
30582 1726855269.03811: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855269.03875: in run() - task 0affcc66-ac2b-aa83-7d57-0000000000f6 30582 1726855269.03893: variable 'ansible_search_path' from source: unknown 30582 1726855269.03897: variable 'ansible_search_path' from source: unknown 30582 1726855269.03925: calling self._execute() 30582 1726855269.03978: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.03982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.03990: variable 'omit' from source: magic vars 30582 1726855269.04309: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.04319: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.04326: variable 'omit' from source: magic vars 30582 1726855269.04362: variable 'omit' from source: magic vars 30582 1726855269.04436: variable '_current_interfaces' from source: set_fact 30582 1726855269.04592: variable 'omit' from source: magic vars 30582 1726855269.04597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855269.04601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855269.04603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855269.04624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.04640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.04673: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855269.04681: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.04692: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.04799: Set connection var ansible_timeout to 10 30582 1726855269.04808: Set connection var ansible_connection to ssh 30582 1726855269.04822: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855269.04831: Set connection var ansible_pipelining to False 30582 1726855269.04840: Set connection var ansible_shell_executable to /bin/sh 30582 1726855269.04847: Set connection var ansible_shell_type to sh 30582 1726855269.04871: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.04878: variable 'ansible_connection' from source: unknown 30582 1726855269.04885: variable 'ansible_module_compression' from source: unknown 30582 1726855269.04894: variable 'ansible_shell_type' from source: unknown 30582 1726855269.04905: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.04910: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.05092: variable 'ansible_pipelining' from source: unknown 30582 1726855269.05097: variable 'ansible_timeout' from source: unknown 30582 1726855269.05100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.05103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855269.05105: variable 'omit' from source: magic vars 30582 1726855269.05107: starting attempt loop 30582 1726855269.05109: running the handler 30582 1726855269.05111: handler run complete 30582 1726855269.05117: attempt loop complete, returning result 30582 1726855269.05123: _execute() done 30582 1726855269.05128: dumping result to json 30582 1726855269.05136: done dumping result, returning 30582 
1726855269.05148: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-0000000000f6] 30582 1726855269.05157: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000f6 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30582 1726855269.05308: no more pending results, returning what we have 30582 1726855269.05311: results queue empty 30582 1726855269.05312: checking for any_errors_fatal 30582 1726855269.05319: done checking for any_errors_fatal 30582 1726855269.05319: checking for max_fail_percentage 30582 1726855269.05321: done checking for max_fail_percentage 30582 1726855269.05322: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.05322: done checking to see if all hosts have failed 30582 1726855269.05323: getting the remaining hosts for this loop 30582 1726855269.05324: done getting the remaining hosts for this loop 30582 1726855269.05328: getting the next task for host managed_node3 30582 1726855269.05335: done getting next task for host managed_node3 30582 1726855269.05338: ^ task is: TASK: Show current_interfaces 30582 1726855269.05342: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855269.05346: getting variables 30582 1726855269.05347: in VariableManager get_vars() 30582 1726855269.05383: Calling all_inventory to load vars for managed_node3 30582 1726855269.05386: Calling groups_inventory to load vars for managed_node3 30582 1726855269.05391: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.05403: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.05405: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.05408: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.05715: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000f6 30582 1726855269.05718: WORKER PROCESS EXITING 30582 1726855269.05740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.05951: done with get_vars() 30582 1726855269.05961: done getting variables 30582 1726855269.06029: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:01:09 -0400 (0:00:00.026) 0:00:05.410 ****** 30582 1726855269.06059: entering _queue_task() for managed_node3/debug 30582 1726855269.06340: worker is 1 (out of 1 available) 30582 1726855269.06353: exiting _queue_task() for managed_node3/debug 30582 1726855269.06364: done queuing things up, now waiting for results queue to drain 30582 1726855269.06365: waiting for pending 
results... 30582 1726855269.06536: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855269.06619: in run() - task 0affcc66-ac2b-aa83-7d57-0000000000bb 30582 1726855269.06631: variable 'ansible_search_path' from source: unknown 30582 1726855269.06634: variable 'ansible_search_path' from source: unknown 30582 1726855269.06660: calling self._execute() 30582 1726855269.06721: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.06724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.06732: variable 'omit' from source: magic vars 30582 1726855269.06993: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.07006: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.07012: variable 'omit' from source: magic vars 30582 1726855269.07044: variable 'omit' from source: magic vars 30582 1726855269.07117: variable 'current_interfaces' from source: set_fact 30582 1726855269.07140: variable 'omit' from source: magic vars 30582 1726855269.07169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855269.07197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855269.07216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855269.07229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.07240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.07263: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855269.07266: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.07269: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.07341: Set connection var ansible_timeout to 10 30582 1726855269.07344: Set connection var ansible_connection to ssh 30582 1726855269.07351: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855269.07353: Set connection var ansible_pipelining to False 30582 1726855269.07363: Set connection var ansible_shell_executable to /bin/sh 30582 1726855269.07365: Set connection var ansible_shell_type to sh 30582 1726855269.07378: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.07381: variable 'ansible_connection' from source: unknown 30582 1726855269.07384: variable 'ansible_module_compression' from source: unknown 30582 1726855269.07386: variable 'ansible_shell_type' from source: unknown 30582 1726855269.07390: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.07392: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.07396: variable 'ansible_pipelining' from source: unknown 30582 1726855269.07401: variable 'ansible_timeout' from source: unknown 30582 1726855269.07405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.07507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855269.07515: variable 'omit' from source: magic vars 30582 1726855269.07520: starting attempt loop 30582 1726855269.07523: running the handler 30582 1726855269.07558: handler run complete 30582 1726855269.07568: attempt loop complete, returning result 30582 1726855269.07570: _execute() done 30582 1726855269.07575: dumping result to json 30582 1726855269.07577: done dumping result, returning 30582 1726855269.07589: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-0000000000bb] 30582 1726855269.07592: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000bb 30582 1726855269.07667: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000000bb 30582 1726855269.07669: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855269.07735: no more pending results, returning what we have 30582 1726855269.07738: results queue empty 30582 1726855269.07739: checking for any_errors_fatal 30582 1726855269.07746: done checking for any_errors_fatal 30582 1726855269.07747: checking for max_fail_percentage 30582 1726855269.07748: done checking for max_fail_percentage 30582 1726855269.07749: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.07749: done checking to see if all hosts have failed 30582 1726855269.07750: getting the remaining hosts for this loop 30582 1726855269.07751: done getting the remaining hosts for this loop 30582 1726855269.07755: getting the next task for host managed_node3 30582 1726855269.07761: done getting next task for host managed_node3 30582 1726855269.07764: ^ task is: TASK: Setup 30582 1726855269.07766: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855269.07769: getting variables 30582 1726855269.07770: in VariableManager get_vars() 30582 1726855269.07795: Calling all_inventory to load vars for managed_node3 30582 1726855269.07797: Calling groups_inventory to load vars for managed_node3 30582 1726855269.07800: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.07809: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.07811: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.07813: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.07997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.08205: done with get_vars() 30582 1726855269.08215: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:01:09 -0400 (0:00:00.022) 0:00:05.433 ****** 30582 1726855269.08327: entering _queue_task() for managed_node3/include_tasks 30582 1726855269.08660: worker is 1 (out of 1 available) 30582 1726855269.08670: exiting _queue_task() for managed_node3/include_tasks 30582 1726855269.08680: done queuing things up, now waiting for results queue to drain 30582 1726855269.08681: waiting for pending results... 
30582 1726855269.08863: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855269.08918: in run() - task 0affcc66-ac2b-aa83-7d57-000000000094 30582 1726855269.08936: variable 'ansible_search_path' from source: unknown 30582 1726855269.08944: variable 'ansible_search_path' from source: unknown 30582 1726855269.08991: variable 'lsr_setup' from source: include params 30582 1726855269.09185: variable 'lsr_setup' from source: include params 30582 1726855269.09255: variable 'omit' from source: magic vars 30582 1726855269.09444: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.09491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.09494: variable 'omit' from source: magic vars 30582 1726855269.09692: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.09709: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.09722: variable 'item' from source: unknown 30582 1726855269.09786: variable 'item' from source: unknown 30582 1726855269.09832: variable 'item' from source: unknown 30582 1726855269.09898: variable 'item' from source: unknown 30582 1726855269.10263: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.10266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.10269: variable 'omit' from source: magic vars 30582 1726855269.10271: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.10273: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.10275: variable 'item' from source: unknown 30582 1726855269.10329: variable 'item' from source: unknown 30582 1726855269.10358: variable 'item' from source: unknown 30582 1726855269.10424: variable 'item' from source: unknown 30582 1726855269.10507: dumping result to json 30582 1726855269.10516: done dumping result, returning 30582 
1726855269.10525: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-000000000094] 30582 1726855269.10533: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000094 30582 1726855269.10628: no more pending results, returning what we have 30582 1726855269.10634: in VariableManager get_vars() 30582 1726855269.10666: Calling all_inventory to load vars for managed_node3 30582 1726855269.10669: Calling groups_inventory to load vars for managed_node3 30582 1726855269.10672: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.10686: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.10694: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.10700: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.11126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.11358: done with get_vars() 30582 1726855269.11365: variable 'ansible_search_path' from source: unknown 30582 1726855269.11366: variable 'ansible_search_path' from source: unknown 30582 1726855269.11393: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000094 30582 1726855269.11396: WORKER PROCESS EXITING 30582 1726855269.11439: variable 'ansible_search_path' from source: unknown 30582 1726855269.11440: variable 'ansible_search_path' from source: unknown 30582 1726855269.11463: we have included files to process 30582 1726855269.11464: generating all_blocks data 30582 1726855269.11466: done generating all_blocks data 30582 1726855269.11469: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855269.11470: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855269.11472: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855269.11814: done processing included file 30582 1726855269.11816: iterating over new_blocks loaded from include file 30582 1726855269.11818: in VariableManager get_vars() 30582 1726855269.11831: done with get_vars() 30582 1726855269.11832: filtering new block on tags 30582 1726855269.11860: done filtering new block on tags 30582 1726855269.11862: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 => (item=tasks/delete_interface.yml) 30582 1726855269.11867: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855269.11868: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855269.11870: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855269.11992: in VariableManager get_vars() 30582 1726855269.12010: done with get_vars() 30582 1726855269.12123: done processing included file 30582 1726855269.12125: iterating over new_blocks loaded from include file 30582 1726855269.12126: in VariableManager get_vars() 30582 1726855269.12138: done with get_vars() 30582 1726855269.12140: filtering new block on tags 30582 1726855269.12169: done filtering new block on tags 30582 1726855269.12171: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30582 1726855269.12174: extending task lists for all hosts with 
included blocks 30582 1726855269.12827: done extending task lists 30582 1726855269.12828: done processing included files 30582 1726855269.12829: results queue empty 30582 1726855269.12834: checking for any_errors_fatal 30582 1726855269.12837: done checking for any_errors_fatal 30582 1726855269.12838: checking for max_fail_percentage 30582 1726855269.12839: done checking for max_fail_percentage 30582 1726855269.12840: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.12841: done checking to see if all hosts have failed 30582 1726855269.12841: getting the remaining hosts for this loop 30582 1726855269.12843: done getting the remaining hosts for this loop 30582 1726855269.12845: getting the next task for host managed_node3 30582 1726855269.12849: done getting next task for host managed_node3 30582 1726855269.12851: ^ task is: TASK: Remove test interface if necessary 30582 1726855269.12854: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855269.12857: getting variables 30582 1726855269.12858: in VariableManager get_vars() 30582 1726855269.12870: Calling all_inventory to load vars for managed_node3 30582 1726855269.12872: Calling groups_inventory to load vars for managed_node3 30582 1726855269.12875: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.12880: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.12882: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.12885: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.13043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.13270: done with get_vars() 30582 1726855269.13279: done getting variables 30582 1726855269.13321: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 14:01:09 -0400 (0:00:00.050) 0:00:05.483 ****** 30582 1726855269.13349: entering _queue_task() for managed_node3/command 30582 1726855269.13644: worker is 1 (out of 1 available) 30582 1726855269.13656: exiting _queue_task() for managed_node3/command 30582 1726855269.13666: done queuing things up, now waiting for results queue to drain 30582 1726855269.13668: waiting for pending results... 
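From the module invocation recorded later in the log (`ip link del statebr` via the `command` action, with the `interface` variable coming from play vars and the rc=1 failure followed by "...ignoring"), the task at `delete_interface.yml:3` plausibly looks like the following. This is a hedged reconstruction: only the command, the variable name, and the ignored-failure behavior are confirmed by the log; the exact YAML spelling is an assumption.

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/delete_interface.yml:3.
# Confirmed by the log: the command, the 'interface' play var,
# and that the rc=1 failure is ignored. The rest is assumed.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}  # interface == "statebr" per play vars
  ignore_errors: true                   # log shows "...ignoring" after rc=1
```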
30582 1726855269.13923: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 30582 1726855269.14028: in run() - task 0affcc66-ac2b-aa83-7d57-00000000011b 30582 1726855269.14054: variable 'ansible_search_path' from source: unknown 30582 1726855269.14061: variable 'ansible_search_path' from source: unknown 30582 1726855269.14101: calling self._execute() 30582 1726855269.14178: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.14190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.14206: variable 'omit' from source: magic vars 30582 1726855269.14560: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.14576: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.14599: variable 'omit' from source: magic vars 30582 1726855269.14648: variable 'omit' from source: magic vars 30582 1726855269.14753: variable 'interface' from source: play vars 30582 1726855269.14775: variable 'omit' from source: magic vars 30582 1726855269.14825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855269.14862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855269.14886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855269.14923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.14941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.15023: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855269.15026: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.15029: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.15106: Set connection var ansible_timeout to 10 30582 1726855269.15114: Set connection var ansible_connection to ssh 30582 1726855269.15131: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855269.15140: Set connection var ansible_pipelining to False 30582 1726855269.15148: Set connection var ansible_shell_executable to /bin/sh 30582 1726855269.15154: Set connection var ansible_shell_type to sh 30582 1726855269.15177: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.15185: variable 'ansible_connection' from source: unknown 30582 1726855269.15193: variable 'ansible_module_compression' from source: unknown 30582 1726855269.15239: variable 'ansible_shell_type' from source: unknown 30582 1726855269.15242: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.15244: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.15246: variable 'ansible_pipelining' from source: unknown 30582 1726855269.15248: variable 'ansible_timeout' from source: unknown 30582 1726855269.15250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.15392: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855269.15413: variable 'omit' from source: magic vars 30582 1726855269.15461: starting attempt loop 30582 1726855269.15464: running the handler 30582 1726855269.15466: _low_level_execute_command(): starting 30582 1726855269.15468: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855269.16212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 
1726855269.16307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.16350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.16364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855269.16385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.16486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.18152: stdout chunk (state=3): >>>/root <<< 30582 1726855269.18313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.18317: stdout chunk (state=3): >>><<< 30582 1726855269.18319: stderr chunk (state=3): >>><<< 30582 1726855269.18341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.18359: _low_level_execute_command(): starting 30582 1726855269.18442: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032 `" && echo ansible-tmp-1726855269.1834772-30872-100735372644032="` echo /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032 `" ) && sleep 0' 30582 1726855269.19006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855269.19020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.19039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.19148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.19174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855269.19199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.19279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.21277: stdout chunk (state=3): >>>ansible-tmp-1726855269.1834772-30872-100735372644032=/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032 <<< 30582 1726855269.21334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.21359: stderr chunk (state=3): >>><<< 30582 1726855269.21362: stdout chunk (state=3): >>><<< 30582 1726855269.21380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855269.1834772-30872-100735372644032=/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.21794: variable 'ansible_module_compression' from source: unknown 30582 1726855269.21798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855269.21800: variable 'ansible_facts' from source: unknown 30582 1726855269.21803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py 30582 1726855269.22150: Sending initial data 30582 1726855269.22159: Sent initial data (156 bytes) 30582 1726855269.23794: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.23798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.23801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855269.23803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.24218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.24241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.25834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855269.25905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855269.25984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptpexz6w1 /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py <<< 30582 1726855269.26007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py" <<< 30582 1726855269.26139: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptpexz6w1" to remote "/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py" <<< 30582 1726855269.27553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.27616: stderr chunk (state=3): >>><<< 30582 1726855269.27625: stdout chunk (state=3): >>><<< 30582 1726855269.27679: done transferring module to remote 30582 1726855269.27708: _low_level_execute_command(): starting 30582 1726855269.27876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/ /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py && sleep 0' 30582 1726855269.28993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.28999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855269.29002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855269.29004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.29006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.29074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.29183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.31112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.31139: stderr chunk (state=3): >>><<< 30582 1726855269.31150: stdout chunk (state=3): >>><<< 30582 1726855269.31218: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.31228: _low_level_execute_command(): starting 30582 1726855269.31247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/AnsiballZ_command.py && sleep 0' 30582 1726855269.32510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.32516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.32712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.32797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 30582 1726855269.48605: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 14:01:09.477490", "end": "2024-09-20 14:01:09.484812", "delta": "0:00:00.007322", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855269.50081: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. <<< 30582 1726855269.50085: stdout chunk (state=3): >>><<< 30582 1726855269.50093: stderr chunk (state=3): >>><<< 30582 1726855269.50369: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 14:01:09.477490", "end": "2024-09-20 14:01:09.484812", "delta": "0:00:00.007322", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 30582 1726855269.50374: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855269.50378: _low_level_execute_command(): starting 30582 1726855269.50380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855269.1834772-30872-100735372644032/ > /dev/null 2>&1 && sleep 0' 30582 1726855269.51646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.51702: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.51731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.51744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.51845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.53726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.53734: stdout chunk (state=3): >>><<< 30582 1726855269.53736: stderr chunk (state=3): >>><<< 30582 1726855269.53750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.53764: handler run complete 30582 1726855269.53791: Evaluated conditional (False): False 30582 1726855269.54025: attempt loop complete, returning result 30582 1726855269.54034: _execute() done 30582 1726855269.54040: dumping result to json 30582 1726855269.54050: done dumping result, returning 30582 1726855269.54061: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0affcc66-ac2b-aa83-7d57-00000000011b] 30582 1726855269.54069: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000011b fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007322", "end": "2024-09-20 14:01:09.484812", "rc": 1, "start": "2024-09-20 14:01:09.477490" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855269.54263: no more pending results, returning what we have 30582 1726855269.54268: results queue empty 30582 1726855269.54269: checking for any_errors_fatal 30582 1726855269.54270: done checking for any_errors_fatal 30582 1726855269.54271: checking for max_fail_percentage 30582 1726855269.54273: done checking for max_fail_percentage 30582 1726855269.54273: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.54274: done checking to see if all hosts have failed 30582 1726855269.54275: getting the remaining hosts for this loop 30582 1726855269.54276: done getting the remaining hosts for this loop 30582 1726855269.54280: getting the next task for host managed_node3 30582 1726855269.54292: done getting next task for host managed_node3 30582 1726855269.54296: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855269.54300: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30582 1726855269.54308: getting variables 30582 1726855269.54310: in VariableManager get_vars() 30582 1726855269.54344: Calling all_inventory to load vars for managed_node3 30582 1726855269.54347: Calling groups_inventory to load vars for managed_node3 30582 1726855269.54351: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.54429: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.54433: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.54436: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.54857: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000011b 30582 1726855269.54861: WORKER PROCESS EXITING 30582 1726855269.54884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.55328: done with get_vars() 30582 1726855269.55340: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 14:01:09 -0400 (0:00:00.422) 0:00:05.905 ****** 30582 1726855269.55553: entering _queue_task() for managed_node3/include_tasks 30582 1726855269.55920: worker is 1 (out of 1 available) 30582 1726855269.55932: exiting _queue_task() for managed_node3/include_tasks 30582 1726855269.55942: done queuing things up, now waiting for results queue to drain 30582 1726855269.55944: waiting for pending results... 
30582 1726855269.56133: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855269.56240: in run() - task 0affcc66-ac2b-aa83-7d57-00000000011f 30582 1726855269.56259: variable 'ansible_search_path' from source: unknown 30582 1726855269.56266: variable 'ansible_search_path' from source: unknown 30582 1726855269.56313: calling self._execute() 30582 1726855269.56385: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.56401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.56416: variable 'omit' from source: magic vars 30582 1726855269.56771: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.56789: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.56802: _execute() done 30582 1726855269.56810: dumping result to json 30582 1726855269.56818: done dumping result, returning 30582 1726855269.56828: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-00000000011f] 30582 1726855269.56836: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000011f 30582 1726855269.57007: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000011f 30582 1726855269.57010: WORKER PROCESS EXITING 30582 1726855269.57036: no more pending results, returning what we have 30582 1726855269.57041: in VariableManager get_vars() 30582 1726855269.57075: Calling all_inventory to load vars for managed_node3 30582 1726855269.57078: Calling groups_inventory to load vars for managed_node3 30582 1726855269.57081: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.57099: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.57102: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.57106: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855269.57471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.57849: done with get_vars() 30582 1726855269.57857: variable 'ansible_search_path' from source: unknown 30582 1726855269.57858: variable 'ansible_search_path' from source: unknown 30582 1726855269.57867: variable 'item' from source: include params 30582 1726855269.57982: variable 'item' from source: include params 30582 1726855269.58241: we have included files to process 30582 1726855269.58242: generating all_blocks data 30582 1726855269.58244: done generating all_blocks data 30582 1726855269.58248: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855269.58249: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855269.58252: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855269.58773: done processing included file 30582 1726855269.58775: iterating over new_blocks loaded from include file 30582 1726855269.58776: in VariableManager get_vars() 30582 1726855269.58830: done with get_vars() 30582 1726855269.58832: filtering new block on tags 30582 1726855269.58861: done filtering new block on tags 30582 1726855269.58864: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855269.58869: extending task lists for all hosts with included blocks 30582 1726855269.59072: done extending task lists 30582 1726855269.59073: done processing included files 30582 1726855269.59074: results queue empty 30582 1726855269.59075: checking for any_errors_fatal 30582 1726855269.59080: done 
checking for any_errors_fatal 30582 1726855269.59081: checking for max_fail_percentage 30582 1726855269.59083: done checking for max_fail_percentage 30582 1726855269.59084: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.59085: done checking to see if all hosts have failed 30582 1726855269.59086: getting the remaining hosts for this loop 30582 1726855269.59090: done getting the remaining hosts for this loop 30582 1726855269.59093: getting the next task for host managed_node3 30582 1726855269.59104: done getting next task for host managed_node3 30582 1726855269.59106: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855269.59110: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855269.59113: getting variables 30582 1726855269.59114: in VariableManager get_vars() 30582 1726855269.59122: Calling all_inventory to load vars for managed_node3 30582 1726855269.59124: Calling groups_inventory to load vars for managed_node3 30582 1726855269.59127: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.59132: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.59135: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.59137: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.59284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.59510: done with get_vars() 30582 1726855269.59519: done getting variables 30582 1726855269.59648: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:01:09 -0400 (0:00:00.041) 0:00:05.946 ****** 30582 1726855269.59677: entering _queue_task() for managed_node3/stat 30582 1726855269.60126: worker is 1 (out of 1 available) 30582 1726855269.60139: exiting _queue_task() for managed_node3/stat 30582 1726855269.60151: done queuing things up, now waiting for results queue to drain 30582 1726855269.60152: waiting for pending results... 
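The stat task queued here probes the kernel's sysfs view of the interface — its result later in the log is `{"exists": false}` for path `/sys/class/net/statebr`. The same existence check can be reproduced with a minimal Python sketch (the play itself uses the `ansible.builtin.stat` module; this helper name is illustrative only):

```python
import os

def interface_exists(name: str) -> bool:
    """A network interface is visible to userspace as an entry under
    /sys/class/net; this mirrors the path the logged stat task checks."""
    return os.path.isdir(os.path.join("/sys/class/net", name))

# For the 'statebr' interface from the log, which was never created,
# this returns False, matching the task result {"exists": false}.
```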
30582 1726855269.60538: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855269.60632: in run() - task 0affcc66-ac2b-aa83-7d57-00000000016e 30582 1726855269.60636: variable 'ansible_search_path' from source: unknown 30582 1726855269.60655: variable 'ansible_search_path' from source: unknown 30582 1726855269.60685: calling self._execute() 30582 1726855269.60848: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.60852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.60854: variable 'omit' from source: magic vars 30582 1726855269.61164: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.61186: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.61206: variable 'omit' from source: magic vars 30582 1726855269.61261: variable 'omit' from source: magic vars 30582 1726855269.61374: variable 'interface' from source: play vars 30582 1726855269.61407: variable 'omit' from source: magic vars 30582 1726855269.61452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855269.61502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855269.61593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855269.61599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.61602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.61608: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855269.61611: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.61613: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.61721: Set connection var ansible_timeout to 10 30582 1726855269.61728: Set connection var ansible_connection to ssh 30582 1726855269.61745: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855269.61754: Set connection var ansible_pipelining to False 30582 1726855269.61763: Set connection var ansible_shell_executable to /bin/sh 30582 1726855269.61771: Set connection var ansible_shell_type to sh 30582 1726855269.61804: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.61814: variable 'ansible_connection' from source: unknown 30582 1726855269.61827: variable 'ansible_module_compression' from source: unknown 30582 1726855269.61846: variable 'ansible_shell_type' from source: unknown 30582 1726855269.61849: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.61851: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.61898: variable 'ansible_pipelining' from source: unknown 30582 1726855269.61902: variable 'ansible_timeout' from source: unknown 30582 1726855269.61904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.62113: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855269.62130: variable 'omit' from source: magic vars 30582 1726855269.62136: starting attempt loop 30582 1726855269.62139: running the handler 30582 1726855269.62166: _low_level_execute_command(): starting 30582 1726855269.62169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855269.62875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855269.62926: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.62932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.62935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.62989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.64619: stdout chunk (state=3): >>>/root <<< 30582 1726855269.64738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.64742: stdout chunk (state=3): >>><<< 30582 1726855269.64751: stderr chunk (state=3): >>><<< 30582 1726855269.64767: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.64778: _low_level_execute_command(): starting 30582 1726855269.64784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122 `" && echo ansible-tmp-1726855269.64768-30900-211189850748122="` echo /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122 `" ) && sleep 0' 30582 1726855269.65207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.65211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.65217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855269.65229: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.65275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.65279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.65348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.67234: stdout chunk (state=3): >>>ansible-tmp-1726855269.64768-30900-211189850748122=/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122 <<< 30582 1726855269.67414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.67417: stderr chunk (state=3): >>><<< 30582 1726855269.67421: stdout chunk (state=3): >>><<< 30582 1726855269.67457: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855269.64768-30900-211189850748122=/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.67487: variable 'ansible_module_compression' from source: unknown 30582 1726855269.67530: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855269.67557: variable 'ansible_facts' from source: unknown 30582 1726855269.67632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py 30582 1726855269.67728: Sending initial data 30582 1726855269.67731: Sent initial data (151 bytes) 30582 1726855269.68136: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.68178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855269.68182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.68244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.69799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855269.69862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855269.70036: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprc1jm1z6 /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py <<< 30582 1726855269.70040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py" <<< 30582 1726855269.70073: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprc1jm1z6" to remote "/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py" <<< 30582 1726855269.70864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.70908: stderr chunk (state=3): >>><<< 30582 1726855269.70912: stdout chunk (state=3): >>><<< 30582 1726855269.70936: done transferring module to remote 30582 1726855269.70944: _low_level_execute_command(): starting 30582 1726855269.70948: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/ /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py && sleep 0' 30582 1726855269.71366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.71370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855269.71372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass <<< 30582 1726855269.71374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855269.71377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.71435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855269.71444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.71493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.73229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.73250: stderr chunk (state=3): >>><<< 30582 1726855269.73254: stdout chunk (state=3): >>><<< 30582 1726855269.73271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.73275: _low_level_execute_command(): starting 30582 1726855269.73279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/AnsiballZ_stat.py && sleep 0' 30582 1726855269.73710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.73713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.73715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855269.73717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855269.73719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.73766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 
1726855269.73769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.73839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.88792: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855269.90173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855269.90177: stdout chunk (state=3): >>><<< 30582 1726855269.90179: stderr chunk (state=3): >>><<< 30582 1726855269.90298: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855269.90303: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855269.90306: _low_level_execute_command(): starting 30582 1726855269.90309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855269.64768-30900-211189850748122/ > /dev/null 2>&1 && sleep 0' 30582 1726855269.90977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855269.91000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855269.91060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855269.91138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855269.91170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855269.91285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855269.93278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855269.93282: stdout chunk (state=3): >>><<< 30582 1726855269.93284: stderr chunk (state=3): >>><<< 30582 1726855269.93306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855269.93468: handler run complete 30582 1726855269.93584: attempt loop complete, returning result 30582 1726855269.93589: _execute() done 30582 1726855269.93592: dumping result to json 30582 1726855269.93596: done dumping result, returning 30582 1726855269.93598: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-00000000016e] 30582 1726855269.93600: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000016e 30582 1726855269.93667: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000016e 30582 1726855269.93670: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855269.93733: no more pending results, returning what we have 30582 1726855269.93737: results queue empty 30582 1726855269.93738: checking for any_errors_fatal 30582 1726855269.93740: done checking for any_errors_fatal 30582 1726855269.93741: checking for max_fail_percentage 30582 1726855269.93743: done checking for max_fail_percentage 30582 1726855269.93743: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.93744: done checking to see if all hosts have failed 30582 1726855269.93745: getting the remaining hosts for this loop 30582 1726855269.93746: done getting the remaining hosts for this loop 30582 1726855269.93750: getting the next task for host managed_node3 30582 1726855269.93759: done getting next task for host managed_node3 30582 1726855269.93761: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30582 1726855269.93766: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855269.93771: getting variables 30582 1726855269.93773: in VariableManager get_vars() 30582 1726855269.93919: Calling all_inventory to load vars for managed_node3 30582 1726855269.93922: Calling groups_inventory to load vars for managed_node3 30582 1726855269.93925: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.93937: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.93939: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.93942: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.94776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.95286: done with get_vars() 30582 1726855269.95299: done getting variables 30582 1726855269.95515: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 30582 1726855269.95638: variable 'interface' from source: play vars TASK 
[Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 14:01:09 -0400 (0:00:00.359) 0:00:06.306 ****** 30582 1726855269.95667: entering _queue_task() for managed_node3/assert 30582 1726855269.95668: Creating lock for assert 30582 1726855269.96114: worker is 1 (out of 1 available) 30582 1726855269.96124: exiting _queue_task() for managed_node3/assert 30582 1726855269.96134: done queuing things up, now waiting for results queue to drain 30582 1726855269.96135: waiting for pending results... 30582 1726855269.96306: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30582 1726855269.96343: in run() - task 0affcc66-ac2b-aa83-7d57-000000000120 30582 1726855269.96366: variable 'ansible_search_path' from source: unknown 30582 1726855269.96373: variable 'ansible_search_path' from source: unknown 30582 1726855269.96415: calling self._execute() 30582 1726855269.96503: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.96513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.96525: variable 'omit' from source: magic vars 30582 1726855269.96890: variable 'ansible_distribution_major_version' from source: facts 30582 1726855269.96918: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855269.96929: variable 'omit' from source: magic vars 30582 1726855269.96977: variable 'omit' from source: magic vars 30582 1726855269.97096: variable 'interface' from source: play vars 30582 1726855269.97122: variable 'omit' from source: magic vars 30582 1726855269.97198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855269.97204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30582 1726855269.97235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855269.97259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.97277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855269.97314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855269.97344: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.97347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.97492: Set connection var ansible_timeout to 10 30582 1726855269.97498: Set connection var ansible_connection to ssh 30582 1726855269.97501: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855269.97503: Set connection var ansible_pipelining to False 30582 1726855269.97506: Set connection var ansible_shell_executable to /bin/sh 30582 1726855269.97508: Set connection var ansible_shell_type to sh 30582 1726855269.97521: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.97529: variable 'ansible_connection' from source: unknown 30582 1726855269.97536: variable 'ansible_module_compression' from source: unknown 30582 1726855269.97543: variable 'ansible_shell_type' from source: unknown 30582 1726855269.97666: variable 'ansible_shell_executable' from source: unknown 30582 1726855269.97669: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855269.97672: variable 'ansible_pipelining' from source: unknown 30582 1726855269.97674: variable 'ansible_timeout' from source: unknown 30582 1726855269.97677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855269.97729: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855269.97744: variable 'omit' from source: magic vars 30582 1726855269.97754: starting attempt loop 30582 1726855269.97761: running the handler 30582 1726855269.97916: variable 'interface_stat' from source: set_fact 30582 1726855269.97930: Evaluated conditional (not interface_stat.stat.exists): True 30582 1726855269.97940: handler run complete 30582 1726855269.97957: attempt loop complete, returning result 30582 1726855269.97964: _execute() done 30582 1726855269.97970: dumping result to json 30582 1726855269.97977: done dumping result, returning 30582 1726855269.97998: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000120] 30582 1726855269.98010: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000120 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855269.98246: no more pending results, returning what we have 30582 1726855269.98250: results queue empty 30582 1726855269.98251: checking for any_errors_fatal 30582 1726855269.98259: done checking for any_errors_fatal 30582 1726855269.98259: checking for max_fail_percentage 30582 1726855269.98261: done checking for max_fail_percentage 30582 1726855269.98262: checking to see if all hosts have failed and the running result is not ok 30582 1726855269.98263: done checking to see if all hosts have failed 30582 1726855269.98263: getting the remaining hosts for this loop 30582 1726855269.98265: done getting the remaining hosts for this loop 30582 1726855269.98269: getting the next task for host managed_node3 30582 1726855269.98276: done getting next task for host managed_node3 30582 1726855269.98280: ^ task is: TASK: Test 30582 
1726855269.98284: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855269.98293: getting variables 30582 1726855269.98297: in VariableManager get_vars() 30582 1726855269.98325: Calling all_inventory to load vars for managed_node3 30582 1726855269.98328: Calling groups_inventory to load vars for managed_node3 30582 1726855269.98331: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855269.98343: Calling all_plugins_play to load vars for managed_node3 30582 1726855269.98346: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855269.98349: Calling groups_plugins_play to load vars for managed_node3 30582 1726855269.98708: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000120 30582 1726855269.98711: WORKER PROCESS EXITING 30582 1726855269.98736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855269.98977: done with get_vars() 30582 1726855269.98989: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:01:09 -0400 (0:00:00.034) 0:00:06.340 ****** 30582 1726855269.99091: entering _queue_task() for managed_node3/include_tasks 30582 1726855269.99511: worker is 1 (out of 1 available) 30582 1726855269.99523: exiting 
_queue_task() for managed_node3/include_tasks 30582 1726855269.99532: done queuing things up, now waiting for results queue to drain 30582 1726855269.99534: waiting for pending results... 30582 1726855269.99714: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855269.99832: in run() - task 0affcc66-ac2b-aa83-7d57-000000000095 30582 1726855269.99868: variable 'ansible_search_path' from source: unknown 30582 1726855269.99871: variable 'ansible_search_path' from source: unknown 30582 1726855269.99912: variable 'lsr_test' from source: include params 30582 1726855270.00119: variable 'lsr_test' from source: include params 30582 1726855270.00209: variable 'omit' from source: magic vars 30582 1726855270.00377: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.00380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.00383: variable 'omit' from source: magic vars 30582 1726855270.00615: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.00634: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.00644: variable 'item' from source: unknown 30582 1726855270.00712: variable 'item' from source: unknown 30582 1726855270.00749: variable 'item' from source: unknown 30582 1726855270.00814: variable 'item' from source: unknown 30582 1726855270.01055: dumping result to json 30582 1726855270.01058: done dumping result, returning 30582 1726855270.01060: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-000000000095] 30582 1726855270.01062: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000095 30582 1726855270.01106: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000095 30582 1726855270.01109: WORKER PROCESS EXITING 30582 1726855270.01184: no more pending results, returning what we have 30582 1726855270.01192: in VariableManager get_vars() 30582 1726855270.01226: Calling 
all_inventory to load vars for managed_node3 30582 1726855270.01229: Calling groups_inventory to load vars for managed_node3 30582 1726855270.01232: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.01251: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.01254: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.01258: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.01968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.02438: done with get_vars() 30582 1726855270.02447: variable 'ansible_search_path' from source: unknown 30582 1726855270.02448: variable 'ansible_search_path' from source: unknown 30582 1726855270.02603: we have included files to process 30582 1726855270.02605: generating all_blocks data 30582 1726855270.02607: done generating all_blocks data 30582 1726855270.02610: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855270.02611: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855270.02616: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855270.03232: done processing included file 30582 1726855270.03233: iterating over new_blocks loaded from include file 30582 1726855270.03235: in VariableManager get_vars() 30582 1726855270.03361: done with get_vars() 30582 1726855270.03363: filtering new block on tags 30582 1726855270.03405: done filtering new block on tags 30582 1726855270.03407: done iterating over new_blocks loaded from include file included: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30582 1726855270.03412: extending task lists for all hosts with included blocks 30582 1726855270.05165: done extending task lists 30582 1726855270.05167: done processing included files 30582 1726855270.05168: results queue empty 30582 1726855270.05169: checking for any_errors_fatal 30582 1726855270.05172: done checking for any_errors_fatal 30582 1726855270.05173: checking for max_fail_percentage 30582 1726855270.05174: done checking for max_fail_percentage 30582 1726855270.05175: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.05176: done checking to see if all hosts have failed 30582 1726855270.05177: getting the remaining hosts for this loop 30582 1726855270.05178: done getting the remaining hosts for this loop 30582 1726855270.05181: getting the next task for host managed_node3 30582 1726855270.05186: done getting next task for host managed_node3 30582 1726855270.05314: ^ task is: TASK: Include network role 30582 1726855270.05319: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30582 1726855270.05322: getting variables 30582 1726855270.05323: in VariableManager get_vars() 30582 1726855270.05335: Calling all_inventory to load vars for managed_node3 30582 1726855270.05338: Calling groups_inventory to load vars for managed_node3 30582 1726855270.05341: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.05347: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.05349: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.05352: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.05648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.05872: done with get_vars() 30582 1726855270.05885: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 14:01:10 -0400 (0:00:00.068) 0:00:06.409 ****** 30582 1726855270.05973: entering _queue_task() for managed_node3/include_role 30582 1726855270.05975: Creating lock for include_role 30582 1726855270.06428: worker is 1 (out of 1 available) 30582 1726855270.06440: exiting _queue_task() for managed_node3/include_role 30582 1726855270.06451: done queuing things up, now waiting for results queue to drain 30582 1726855270.06452: waiting for pending results... 
30582 1726855270.06741: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855270.06763: in run() - task 0affcc66-ac2b-aa83-7d57-00000000018e 30582 1726855270.06786: variable 'ansible_search_path' from source: unknown 30582 1726855270.06800: variable 'ansible_search_path' from source: unknown 30582 1726855270.06845: calling self._execute() 30582 1726855270.06924: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.06947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.06991: variable 'omit' from source: magic vars 30582 1726855270.07343: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.07360: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.07370: _execute() done 30582 1726855270.07382: dumping result to json 30582 1726855270.07396: done dumping result, returning 30582 1726855270.07407: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-00000000018e] 30582 1726855270.07496: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000018e 30582 1726855270.07579: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000018e 30582 1726855270.07582: WORKER PROCESS EXITING 30582 1726855270.07620: no more pending results, returning what we have 30582 1726855270.07625: in VariableManager get_vars() 30582 1726855270.07659: Calling all_inventory to load vars for managed_node3 30582 1726855270.07663: Calling groups_inventory to load vars for managed_node3 30582 1726855270.07667: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.07680: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.07683: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.07686: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.08130: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.08353: done with get_vars() 30582 1726855270.08360: variable 'ansible_search_path' from source: unknown 30582 1726855270.08361: variable 'ansible_search_path' from source: unknown 30582 1726855270.08539: variable 'omit' from source: magic vars 30582 1726855270.08578: variable 'omit' from source: magic vars 30582 1726855270.08590: variable 'omit' from source: magic vars 30582 1726855270.08593: we have included files to process 30582 1726855270.08593: generating all_blocks data 30582 1726855270.08596: done generating all_blocks data 30582 1726855270.08597: processing included file: fedora.linux_system_roles.network 30582 1726855270.08611: in VariableManager get_vars() 30582 1726855270.08619: done with get_vars() 30582 1726855270.08666: in VariableManager get_vars() 30582 1726855270.08705: done with get_vars() 30582 1726855270.08742: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855270.08879: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855270.08964: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855270.09357: in VariableManager get_vars() 30582 1726855270.09370: done with get_vars() 30582 1726855270.09651: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855270.11021: iterating over new_blocks loaded from include file 30582 1726855270.11024: in VariableManager get_vars() 30582 1726855270.11039: done with get_vars() 30582 1726855270.11041: filtering new block on tags 30582 1726855270.11315: done filtering new block on tags 30582 1726855270.11318: in VariableManager get_vars() 30582 1726855270.11332: done with 
get_vars() 30582 1726855270.11333: filtering new block on tags 30582 1726855270.11350: done filtering new block on tags 30582 1726855270.11352: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855270.11357: extending task lists for all hosts with included blocks 30582 1726855270.11461: done extending task lists 30582 1726855270.11463: done processing included files 30582 1726855270.11464: results queue empty 30582 1726855270.11464: checking for any_errors_fatal 30582 1726855270.11467: done checking for any_errors_fatal 30582 1726855270.11468: checking for max_fail_percentage 30582 1726855270.11469: done checking for max_fail_percentage 30582 1726855270.11469: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.11470: done checking to see if all hosts have failed 30582 1726855270.11470: getting the remaining hosts for this loop 30582 1726855270.11471: done getting the remaining hosts for this loop 30582 1726855270.11473: getting the next task for host managed_node3 30582 1726855270.11476: done getting next task for host managed_node3 30582 1726855270.11478: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855270.11480: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855270.11489: getting variables 30582 1726855270.11489: in VariableManager get_vars() 30582 1726855270.11500: Calling all_inventory to load vars for managed_node3 30582 1726855270.11501: Calling groups_inventory to load vars for managed_node3 30582 1726855270.11502: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.11508: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.11510: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.11513: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.11620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.11747: done with get_vars() 30582 1726855270.11754: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:01:10 -0400 (0:00:00.058) 0:00:06.467 ****** 30582 1726855270.11805: entering _queue_task() for managed_node3/include_tasks 30582 1726855270.12030: worker is 1 (out of 1 available) 30582 1726855270.12043: exiting _queue_task() for managed_node3/include_tasks 30582 1726855270.12053: done queuing things up, now waiting for results queue to drain 30582 1726855270.12055: 
waiting for pending results... 30582 1726855270.12212: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855270.12291: in run() - task 0affcc66-ac2b-aa83-7d57-00000000020c 30582 1726855270.12300: variable 'ansible_search_path' from source: unknown 30582 1726855270.12304: variable 'ansible_search_path' from source: unknown 30582 1726855270.12330: calling self._execute() 30582 1726855270.12389: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.12393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.12406: variable 'omit' from source: magic vars 30582 1726855270.12653: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.12662: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.12667: _execute() done 30582 1726855270.12670: dumping result to json 30582 1726855270.12675: done dumping result, returning 30582 1726855270.12682: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-00000000020c] 30582 1726855270.12689: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020c 30582 1726855270.12768: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020c 30582 1726855270.12770: WORKER PROCESS EXITING 30582 1726855270.12813: no more pending results, returning what we have 30582 1726855270.12819: in VariableManager get_vars() 30582 1726855270.12856: Calling all_inventory to load vars for managed_node3 30582 1726855270.12859: Calling groups_inventory to load vars for managed_node3 30582 1726855270.12861: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.12871: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.12874: Calling groups_plugins_inventory to load vars for managed_node3 30582 
1726855270.12876: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.13025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.13167: done with get_vars() 30582 1726855270.13173: variable 'ansible_search_path' from source: unknown 30582 1726855270.13173: variable 'ansible_search_path' from source: unknown 30582 1726855270.13203: we have included files to process 30582 1726855270.13204: generating all_blocks data 30582 1726855270.13206: done generating all_blocks data 30582 1726855270.13208: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855270.13209: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855270.13210: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855270.13830: done processing included file 30582 1726855270.13832: iterating over new_blocks loaded from include file 30582 1726855270.13833: in VariableManager get_vars() 30582 1726855270.13856: done with get_vars() 30582 1726855270.13858: filtering new block on tags 30582 1726855270.13893: done filtering new block on tags 30582 1726855270.13898: in VariableManager get_vars() 30582 1726855270.13919: done with get_vars() 30582 1726855270.13921: filtering new block on tags 30582 1726855270.13964: done filtering new block on tags 30582 1726855270.13967: in VariableManager get_vars() 30582 1726855270.13989: done with get_vars() 30582 1726855270.13991: filtering new block on tags 30582 1726855270.14033: done filtering new block on tags 30582 1726855270.14036: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855270.14041: 
extending task lists for all hosts with included blocks 30582 1726855270.15372: done extending task lists 30582 1726855270.15373: done processing included files 30582 1726855270.15374: results queue empty 30582 1726855270.15374: checking for any_errors_fatal 30582 1726855270.15376: done checking for any_errors_fatal 30582 1726855270.15377: checking for max_fail_percentage 30582 1726855270.15378: done checking for max_fail_percentage 30582 1726855270.15378: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.15379: done checking to see if all hosts have failed 30582 1726855270.15379: getting the remaining hosts for this loop 30582 1726855270.15380: done getting the remaining hosts for this loop 30582 1726855270.15382: getting the next task for host managed_node3 30582 1726855270.15385: done getting next task for host managed_node3 30582 1726855270.15388: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855270.15391: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855270.15399: getting variables 30582 1726855270.15400: in VariableManager get_vars() 30582 1726855270.15408: Calling all_inventory to load vars for managed_node3 30582 1726855270.15410: Calling groups_inventory to load vars for managed_node3 30582 1726855270.15411: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.15414: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.15416: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.15417: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.15509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.15632: done with get_vars() 30582 1726855270.15639: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:01:10 -0400 (0:00:00.038) 0:00:06.506 ****** 30582 1726855270.15690: entering _queue_task() for managed_node3/setup 30582 1726855270.15907: worker is 1 (out of 1 available) 30582 1726855270.15920: exiting _queue_task() for managed_node3/setup 30582 1726855270.15931: done queuing things up, now waiting for results queue to drain 30582 1726855270.15933: waiting for pending results... 
30582 1726855270.16091: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855270.16180: in run() - task 0affcc66-ac2b-aa83-7d57-000000000269 30582 1726855270.16193: variable 'ansible_search_path' from source: unknown 30582 1726855270.16199: variable 'ansible_search_path' from source: unknown 30582 1726855270.16226: calling self._execute() 30582 1726855270.16282: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.16288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.16299: variable 'omit' from source: magic vars 30582 1726855270.16805: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.16892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.17011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855270.18816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855270.18889: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855270.18938: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855270.18977: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855270.19009: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855270.19083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855270.19119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855270.19149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855270.19195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855270.19216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855270.19271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855270.19301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855270.19392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855270.19396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855270.19398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855270.19532: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855270.19543: variable 'ansible_facts' from source: unknown 30582 1726855270.19599: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855270.19603: when evaluation is False, skipping this task 30582 1726855270.19605: _execute() done 30582 1726855270.19608: dumping result to json 30582 1726855270.19610: done dumping result, returning 30582 1726855270.19616: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000000269] 30582 1726855270.19621: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000269 30582 1726855270.19717: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000269 30582 1726855270.19720: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855270.19800: no more pending results, returning what we have 30582 1726855270.19804: results queue empty 30582 1726855270.19805: checking for any_errors_fatal 30582 1726855270.19807: done checking for any_errors_fatal 30582 1726855270.19807: checking for max_fail_percentage 30582 1726855270.19809: done checking for max_fail_percentage 30582 1726855270.19809: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.19810: done checking to see if all hosts have failed 30582 1726855270.19811: getting the remaining hosts for this loop 30582 1726855270.19812: done getting the remaining hosts for this loop 30582 1726855270.19816: getting the next task for host managed_node3 30582 1726855270.19824: done getting next task for host managed_node3 30582 1726855270.19828: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855270.19833: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855270.19845: getting variables 30582 1726855270.19847: in VariableManager get_vars() 30582 1726855270.19879: Calling all_inventory to load vars for managed_node3 30582 1726855270.19882: Calling groups_inventory to load vars for managed_node3 30582 1726855270.19884: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.19897: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.19899: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.19907: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.20170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.20303: done with get_vars() 30582 1726855270.20311: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:01:10 -0400 (0:00:00.046) 0:00:06.553 ****** 30582 1726855270.20374: entering _queue_task() for managed_node3/stat 30582 1726855270.20576: worker is 1 (out of 1 available) 30582 1726855270.20592: exiting _queue_task() for managed_node3/stat 30582 1726855270.20605: done queuing things up, now waiting for results queue to drain 30582 1726855270.20607: waiting for pending results... 
30582 1726855270.20772: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855270.20851: in run() - task 0affcc66-ac2b-aa83-7d57-00000000026b 30582 1726855270.20863: variable 'ansible_search_path' from source: unknown 30582 1726855270.20866: variable 'ansible_search_path' from source: unknown 30582 1726855270.20893: calling self._execute() 30582 1726855270.20954: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.20961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.20969: variable 'omit' from source: magic vars 30582 1726855270.21256: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.21466: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.21469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855270.21710: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855270.21759: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855270.21813: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855270.21864: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855270.21996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855270.22046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855270.22086: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855270.22132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855270.22232: variable '__network_is_ostree' from source: set_fact 30582 1726855270.22251: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855270.22264: when evaluation is False, skipping this task 30582 1726855270.22273: _execute() done 30582 1726855270.22283: dumping result to json 30582 1726855270.22301: done dumping result, returning 30582 1726855270.22309: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-00000000026b] 30582 1726855270.22311: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026b 30582 1726855270.22407: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026b 30582 1726855270.22410: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855270.22483: no more pending results, returning what we have 30582 1726855270.22490: results queue empty 30582 1726855270.22491: checking for any_errors_fatal 30582 1726855270.22501: done checking for any_errors_fatal 30582 1726855270.22502: checking for max_fail_percentage 30582 1726855270.22503: done checking for max_fail_percentage 30582 1726855270.22504: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.22505: done checking to see if all hosts have failed 30582 1726855270.22505: getting the remaining hosts for this loop 30582 1726855270.22507: done getting the remaining hosts for this loop 30582 
1726855270.22510: getting the next task for host managed_node3 30582 1726855270.22517: done getting next task for host managed_node3 30582 1726855270.22520: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855270.22525: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855270.22537: getting variables 30582 1726855270.22538: in VariableManager get_vars() 30582 1726855270.22566: Calling all_inventory to load vars for managed_node3 30582 1726855270.22569: Calling groups_inventory to load vars for managed_node3 30582 1726855270.22571: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.22578: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.22581: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.22583: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.22739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.22884: done with get_vars() 30582 1726855270.22902: done getting variables 30582 1726855270.22957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:01:10 -0400 (0:00:00.026) 0:00:06.579 ****** 30582 1726855270.22995: entering _queue_task() for managed_node3/set_fact 30582 1726855270.23256: worker is 1 (out of 1 available) 30582 1726855270.23271: exiting _queue_task() for managed_node3/set_fact 30582 1726855270.23283: done queuing things up, now waiting for results queue to drain 30582 1726855270.23284: waiting for pending results... 
30582 1726855270.23809: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855270.23815: in run() - task 0affcc66-ac2b-aa83-7d57-00000000026c 30582 1726855270.23818: variable 'ansible_search_path' from source: unknown 30582 1726855270.23821: variable 'ansible_search_path' from source: unknown 30582 1726855270.23824: calling self._execute() 30582 1726855270.23832: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.23844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.23857: variable 'omit' from source: magic vars 30582 1726855270.24269: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.24276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.24390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855270.24577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855270.24616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855270.24641: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855270.24667: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855270.24737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855270.24754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855270.24772: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855270.24791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855270.24861: variable '__network_is_ostree' from source: set_fact 30582 1726855270.24867: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855270.24870: when evaluation is False, skipping this task 30582 1726855270.24872: _execute() done 30582 1726855270.24874: dumping result to json 30582 1726855270.24879: done dumping result, returning 30582 1726855270.24889: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-00000000026c] 30582 1726855270.24892: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026c 30582 1726855270.24978: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026c 30582 1726855270.24981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855270.25029: no more pending results, returning what we have 30582 1726855270.25033: results queue empty 30582 1726855270.25034: checking for any_errors_fatal 30582 1726855270.25040: done checking for any_errors_fatal 30582 1726855270.25041: checking for max_fail_percentage 30582 1726855270.25043: done checking for max_fail_percentage 30582 1726855270.25043: checking to see if all hosts have failed and the running result is not ok 30582 1726855270.25044: done checking to see if all hosts have failed 30582 1726855270.25045: getting the remaining hosts for this loop 30582 1726855270.25046: done getting the remaining hosts for this loop 
30582 1726855270.25049: getting the next task for host managed_node3 30582 1726855270.25058: done getting next task for host managed_node3 30582 1726855270.25061: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855270.25067: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855270.25081: getting variables 30582 1726855270.25083: in VariableManager get_vars() 30582 1726855270.25265: Calling all_inventory to load vars for managed_node3 30582 1726855270.25269: Calling groups_inventory to load vars for managed_node3 30582 1726855270.25271: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855270.25280: Calling all_plugins_play to load vars for managed_node3 30582 1726855270.25283: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855270.25286: Calling groups_plugins_play to load vars for managed_node3 30582 1726855270.25513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855270.25771: done with get_vars() 30582 1726855270.25783: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:01:10 -0400 (0:00:00.028) 0:00:06.608 ****** 30582 1726855270.25895: entering _queue_task() for managed_node3/service_facts 30582 1726855270.25897: Creating lock for service_facts 30582 1726855270.26182: worker is 1 (out of 1 available) 30582 1726855270.26313: exiting _queue_task() for managed_node3/service_facts 30582 1726855270.26325: done queuing things up, now waiting for results queue to drain 30582 1726855270.26327: waiting for pending results... 
30582 1726855270.26544: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855270.26804: in run() - task 0affcc66-ac2b-aa83-7d57-00000000026e 30582 1726855270.26808: variable 'ansible_search_path' from source: unknown 30582 1726855270.26812: variable 'ansible_search_path' from source: unknown 30582 1726855270.26815: calling self._execute() 30582 1726855270.26973: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.26977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.26980: variable 'omit' from source: magic vars 30582 1726855270.27241: variable 'ansible_distribution_major_version' from source: facts 30582 1726855270.27259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855270.27267: variable 'omit' from source: magic vars 30582 1726855270.27363: variable 'omit' from source: magic vars 30582 1726855270.27409: variable 'omit' from source: magic vars 30582 1726855270.27459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855270.27506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855270.27527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855270.27556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855270.27567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855270.27609: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855270.27612: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.27615: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855270.27739: Set connection var ansible_timeout to 10 30582 1726855270.27742: Set connection var ansible_connection to ssh 30582 1726855270.27756: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855270.27761: Set connection var ansible_pipelining to False 30582 1726855270.27767: Set connection var ansible_shell_executable to /bin/sh 30582 1726855270.27770: Set connection var ansible_shell_type to sh 30582 1726855270.27795: variable 'ansible_shell_executable' from source: unknown 30582 1726855270.27809: variable 'ansible_connection' from source: unknown 30582 1726855270.27813: variable 'ansible_module_compression' from source: unknown 30582 1726855270.27815: variable 'ansible_shell_type' from source: unknown 30582 1726855270.27818: variable 'ansible_shell_executable' from source: unknown 30582 1726855270.27820: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855270.27994: variable 'ansible_pipelining' from source: unknown 30582 1726855270.27997: variable 'ansible_timeout' from source: unknown 30582 1726855270.28000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855270.28059: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855270.28069: variable 'omit' from source: magic vars 30582 1726855270.28075: starting attempt loop 30582 1726855270.28078: running the handler 30582 1726855270.28095: _low_level_execute_command(): starting 30582 1726855270.28108: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855270.28891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.28952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855270.28986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855270.29082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855270.30764: stdout chunk (state=3): >>>/root <<< 30582 1726855270.30861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855270.30889: stderr chunk (state=3): >>><<< 30582 1726855270.30893: stdout chunk (state=3): >>><<< 30582 1726855270.30913: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855270.30925: _low_level_execute_command(): starting 30582 1726855270.30932: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905 `" && echo ansible-tmp-1726855270.3091443-30944-119039128218905="` echo /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905 `" ) && sleep 0' 30582 1726855270.31361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855270.31364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855270.31367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.31369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855270.31378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.31426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855270.31430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855270.31432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855270.31498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855270.33386: stdout chunk (state=3): >>>ansible-tmp-1726855270.3091443-30944-119039128218905=/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905 <<< 30582 1726855270.33499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855270.33522: stderr chunk (state=3): >>><<< 30582 1726855270.33525: stdout chunk (state=3): >>><<< 30582 1726855270.33539: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855270.3091443-30944-119039128218905=/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855270.33576: variable 'ansible_module_compression' from source: unknown 30582 1726855270.33614: ANSIBALLZ: Using lock for service_facts 30582 1726855270.33617: ANSIBALLZ: Acquiring lock 30582 1726855270.33619: ANSIBALLZ: Lock acquired: 140270805628880 30582 1726855270.33622: ANSIBALLZ: Creating module 30582 1726855270.41942: ANSIBALLZ: Writing module into payload 30582 1726855270.42008: ANSIBALLZ: Writing module 30582 1726855270.42025: ANSIBALLZ: Renaming module 30582 1726855270.42036: ANSIBALLZ: Done creating module 30582 1726855270.42050: variable 'ansible_facts' from source: unknown 30582 1726855270.42102: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py 30582 1726855270.42208: Sending initial data 30582 1726855270.42211: Sent initial data (162 bytes) 30582 1726855270.42675: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855270.42678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855270.42680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855270.42684: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855270.42686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.42742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855270.42745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855270.42747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855270.42824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855270.44460: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855270.44465: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855270.44517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855270.44580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptd4pmxa0 /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py <<< 30582 1726855270.44583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py" <<< 30582 1726855270.44639: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptd4pmxa0" to remote "/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py" <<< 30582 1726855270.44643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py" <<< 30582 1726855270.45252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855270.45299: stderr chunk (state=3): >>><<< 30582 1726855270.45302: stdout chunk (state=3): >>><<< 30582 1726855270.45357: done transferring module to remote 30582 1726855270.45366: _low_level_execute_command(): starting 30582 1726855270.45371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/ /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py && sleep 0' 30582 1726855270.45823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855270.45826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855270.45828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.45833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855270.45838: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855270.45841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.45883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855270.45889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855270.45952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855270.47714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855270.47739: stderr chunk (state=3): >>><<< 30582 1726855270.47742: stdout chunk (state=3): >>><<< 30582 1726855270.47753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855270.47756: _low_level_execute_command(): starting 30582 1726855270.47761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/AnsiballZ_service_facts.py && sleep 0' 30582 1726855270.48169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855270.48178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855270.48209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.48213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855270.48215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855270.48261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855270.48264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855270.48334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.02002: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30582 1726855272.02071: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": 
{"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": 
{"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "syste<<< 30582 1726855272.02092: stdout chunk (state=3): >>>md-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855272.03623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855272.03627: stdout chunk (state=3): >>><<< 30582 1726855272.03629: stderr chunk (state=3): >>><<< 30582 1726855272.03796: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": 
"dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855272.04368: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855272.04393: _low_level_execute_command(): starting 30582 1726855272.04406: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855270.3091443-30944-119039128218905/ > /dev/null 2>&1 && sleep 0' 30582 1726855272.05078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855272.05098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855272.05119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855272.05143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855272.05204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.05245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855272.05259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855272.05278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.05384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.07239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855272.07261: stdout chunk (state=3): >>><<< 30582 1726855272.07264: stderr chunk (state=3): >>><<< 30582 1726855272.07493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855272.07497: handler run complete 30582 1726855272.07499: variable 'ansible_facts' from source: unknown 30582 1726855272.07627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855272.08034: variable 'ansible_facts' from source: unknown 30582 1726855272.08162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855272.08370: attempt loop complete, returning result 30582 1726855272.08382: _execute() done 30582 1726855272.08392: dumping result to json 30582 1726855272.08453: done dumping result, returning 30582 1726855272.08467: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-00000000026e] 30582 1726855272.08478: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026e 30582 1726855272.09386: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026e 30582 1726855272.09392: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855272.09444: no more pending results, returning what we have 30582 1726855272.09447: results queue empty 30582 1726855272.09568: checking for any_errors_fatal 30582 1726855272.09574: done checking for any_errors_fatal 30582 1726855272.09575: checking for max_fail_percentage 30582 1726855272.09577: done checking for max_fail_percentage 30582 1726855272.09578: checking to see if all hosts have failed and the running result is not ok 30582 1726855272.09578: done checking to see if all hosts have failed 30582 1726855272.09579: getting the remaining hosts for this loop 30582 1726855272.09580: done getting the remaining hosts for this loop 30582 1726855272.09584: getting 
the next task for host managed_node3 30582 1726855272.09593: done getting next task for host managed_node3 30582 1726855272.09597: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855272.09603: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855272.09613: getting variables 30582 1726855272.09615: in VariableManager get_vars() 30582 1726855272.09644: Calling all_inventory to load vars for managed_node3 30582 1726855272.09647: Calling groups_inventory to load vars for managed_node3 30582 1726855272.09649: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855272.09658: Calling all_plugins_play to load vars for managed_node3 30582 1726855272.09661: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855272.09669: Calling groups_plugins_play to load vars for managed_node3 30582 1726855272.10057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855272.10548: done with get_vars() 30582 1726855272.10561: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:01:12 -0400 (0:00:01.847) 0:00:08.456 ****** 30582 1726855272.10651: entering _queue_task() for managed_node3/package_facts 30582 1726855272.10653: Creating lock for package_facts 30582 1726855272.10935: worker is 1 (out of 1 available) 30582 1726855272.10946: exiting _queue_task() for managed_node3/package_facts 30582 1726855272.10956: done queuing things up, now waiting for results queue to drain 30582 1726855272.10957: waiting for pending results... 
30582 1726855272.11218: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855272.11379: in run() - task 0affcc66-ac2b-aa83-7d57-00000000026f 30582 1726855272.11407: variable 'ansible_search_path' from source: unknown 30582 1726855272.11418: variable 'ansible_search_path' from source: unknown 30582 1726855272.11454: calling self._execute() 30582 1726855272.11539: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855272.11550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855272.11592: variable 'omit' from source: magic vars 30582 1726855272.11919: variable 'ansible_distribution_major_version' from source: facts 30582 1726855272.11938: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855272.11953: variable 'omit' from source: magic vars 30582 1726855272.12032: variable 'omit' from source: magic vars 30582 1726855272.12176: variable 'omit' from source: magic vars 30582 1726855272.12179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855272.12182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855272.12184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855272.12192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855272.12208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855272.12241: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855272.12249: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855272.12255: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855272.12365: Set connection var ansible_timeout to 10 30582 1726855272.12372: Set connection var ansible_connection to ssh 30582 1726855272.12384: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855272.12395: Set connection var ansible_pipelining to False 30582 1726855272.12413: Set connection var ansible_shell_executable to /bin/sh 30582 1726855272.12421: Set connection var ansible_shell_type to sh 30582 1726855272.12445: variable 'ansible_shell_executable' from source: unknown 30582 1726855272.12453: variable 'ansible_connection' from source: unknown 30582 1726855272.12460: variable 'ansible_module_compression' from source: unknown 30582 1726855272.12466: variable 'ansible_shell_type' from source: unknown 30582 1726855272.12472: variable 'ansible_shell_executable' from source: unknown 30582 1726855272.12514: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855272.12517: variable 'ansible_pipelining' from source: unknown 30582 1726855272.12519: variable 'ansible_timeout' from source: unknown 30582 1726855272.12521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855272.12686: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855272.12704: variable 'omit' from source: magic vars 30582 1726855272.12713: starting attempt loop 30582 1726855272.12720: running the handler 30582 1726855272.12739: _low_level_execute_command(): starting 30582 1726855272.12792: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855272.13446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855272.13463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855272.13507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.13596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855272.13637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.13725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.15407: stdout chunk (state=3): >>>/root <<< 30582 1726855272.15524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855272.15528: stdout chunk (state=3): >>><<< 30582 1726855272.15530: stderr chunk (state=3): >>><<< 30582 1726855272.15546: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855272.15563: _low_level_execute_command(): starting 30582 1726855272.15574: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670 `" && echo ansible-tmp-1726855272.1555197-30986-268409157312670="` echo /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670 `" ) && sleep 0' 30582 1726855272.16166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855272.16180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855272.16199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855272.16218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855272.16234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855272.16245: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855272.16268: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.16295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855272.16309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855272.16319: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855272.16330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855272.16413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855272.16430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855272.16467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.16522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.18425: stdout chunk (state=3): >>>ansible-tmp-1726855272.1555197-30986-268409157312670=/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670 <<< 30582 1726855272.18576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855272.18589: stdout chunk (state=3): >>><<< 30582 1726855272.18606: stderr chunk (state=3): >>><<< 30582 1726855272.18625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855272.1555197-30986-268409157312670=/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855272.18684: variable 'ansible_module_compression' from source: unknown 30582 1726855272.18750: ANSIBALLZ: Using lock for package_facts 30582 1726855272.18758: ANSIBALLZ: Acquiring lock 30582 1726855272.18766: ANSIBALLZ: Lock acquired: 140270805746608 30582 1726855272.18775: ANSIBALLZ: Creating module 30582 1726855272.51968: ANSIBALLZ: Writing module into payload 30582 1726855272.52136: ANSIBALLZ: Writing module 30582 1726855272.52173: ANSIBALLZ: Renaming module 30582 1726855272.52190: ANSIBALLZ: Done creating module 30582 1726855272.52248: variable 'ansible_facts' from source: unknown 30582 1726855272.52451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py 30582 1726855272.52696: Sending initial data 30582 1726855272.52699: Sent initial data (162 bytes) 30582 1726855272.53160: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 
1726855272.53173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.53202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.53253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855272.53256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855272.53282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.53377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.55029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855272.55119: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855272.55200: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp03r4omp2 /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py <<< 30582 1726855272.55203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py" <<< 30582 1726855272.55259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp03r4omp2" to remote "/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py" <<< 30582 1726855272.56797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855272.56893: stderr chunk (state=3): >>><<< 30582 1726855272.56903: stdout chunk (state=3): >>><<< 30582 1726855272.56906: done transferring module to remote 30582 1726855272.56921: _low_level_execute_command(): starting 30582 1726855272.56930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/ /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py && sleep 0' 30582 1726855272.57595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855272.57610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855272.57626: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30582 1726855272.57676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.57755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855272.57803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.57889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855272.59827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855272.59831: stderr chunk (state=3): >>><<< 30582 1726855272.59834: stdout chunk (state=3): >>><<< 30582 1726855272.59836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855272.59838: _low_level_execute_command(): starting 30582 1726855272.59841: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/AnsiballZ_package_facts.py && sleep 0' 30582 1726855272.60393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855272.60410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855272.60432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855272.60448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855272.60463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855272.60473: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855272.60485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.60539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855272.60597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855272.60613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855272.60636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855272.60732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855273.04719: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": 
[{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": 
"npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", 
"version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": 
"76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", 
"release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30582 1726855273.04763: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855273.04769: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", 
"version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855273.04773: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": 
[{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", 
"version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855273.06423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
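The chunked stdout above is the JSON result of Ansible's `package_facts` module: a `packages` mapping from package name to a list of installed instances, each with `version`, `release`, `epoch` (possibly `null`), `arch`, and `source` keys. A minimal sketch of consuming that structure outside of a playbook, using a tiny hand-copied excerpt of the log's data (the `full_evr` helper is illustrative, not part of Ansible):

```python
import json

# Hypothetical excerpt of the `ansible_facts.packages` structure from the
# log above: package name -> list of installed instances.
facts_json = '''
{
  "packages": {
    "openssl": [{"name": "openssl", "version": "3.2.2",
                 "release": "12.el10", "epoch": 1,
                 "arch": "x86_64", "source": "rpm"}],
    "git": [{"name": "git", "version": "2.45.2",
             "release": "3.el10", "epoch": null,
             "arch": "x86_64", "source": "rpm"}]
  }
}
'''

packages = json.loads(facts_json)["packages"]

def full_evr(pkg):
    """Render an epoch:version-release string the way rpm prints it.

    A null/zero epoch is conventionally omitted from the printed form.
    """
    epoch = pkg.get("epoch")
    prefix = f"{epoch}:" if epoch else ""
    return f'{prefix}{pkg["version"]}-{pkg["release"]}'

print(full_evr(packages["openssl"][0]))  # 1:3.2.2-12.el10
print(full_evr(packages["git"][0]))      # 2.45.2-3.el10
```

In a playbook, the same data is reached as `ansible_facts.packages['openssl'][0].version` after a `package_facts:` task; each name maps to a list because multiple instances (e.g. multilib arches) can coexist.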
<<< 30582 1726855273.06426: stdout chunk (state=3): >>><<< 30582 1726855273.06428: stderr chunk (state=3): >>><<< 30582 1726855273.06500: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855273.09966: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855273.10093: _low_level_execute_command(): starting 30582 1726855273.10098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855272.1555197-30986-268409157312670/ > /dev/null 2>&1 && sleep 0' 30582 1726855273.10695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855273.10715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855273.10741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855273.10843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855273.10864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855273.10880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855273.10977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855273.12872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855273.12899: stderr chunk (state=3): >>><<< 30582 1726855273.12910: stdout chunk (state=3): >>><<< 30582 1726855273.12932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855273.12948: handler run 
complete 30582 1726855273.13853: variable 'ansible_facts' from source: unknown 30582 1726855273.14223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.15826: variable 'ansible_facts' from source: unknown 30582 1726855273.16058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.16441: attempt loop complete, returning result 30582 1726855273.16456: _execute() done 30582 1726855273.16459: dumping result to json 30582 1726855273.16573: done dumping result, returning 30582 1726855273.16581: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-00000000026f] 30582 1726855273.16586: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026f 30582 1726855273.18226: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000026f 30582 1726855273.18230: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855273.18329: no more pending results, returning what we have 30582 1726855273.18332: results queue empty 30582 1726855273.18333: checking for any_errors_fatal 30582 1726855273.18337: done checking for any_errors_fatal 30582 1726855273.18337: checking for max_fail_percentage 30582 1726855273.18339: done checking for max_fail_percentage 30582 1726855273.18340: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.18340: done checking to see if all hosts have failed 30582 1726855273.18341: getting the remaining hosts for this loop 30582 1726855273.18342: done getting the remaining hosts for this loop 30582 1726855273.18346: getting the next task for host managed_node3 30582 1726855273.18360: done getting next task for host managed_node3 30582 
1726855273.18365: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855273.18371: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855273.18380: getting variables 30582 1726855273.18381: in VariableManager get_vars() 30582 1726855273.18410: Calling all_inventory to load vars for managed_node3 30582 1726855273.18413: Calling groups_inventory to load vars for managed_node3 30582 1726855273.18415: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.18424: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.18427: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.18430: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.19293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.20160: done with get_vars() 30582 1726855273.20177: done getting variables 30582 1726855273.20226: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:01:13 -0400 (0:00:01.096) 0:00:09.552 ****** 30582 1726855273.20256: entering _queue_task() for managed_node3/debug 30582 1726855273.20500: worker is 1 (out of 1 available) 30582 1726855273.20517: exiting _queue_task() for managed_node3/debug 30582 1726855273.20528: done queuing things up, now waiting for results queue to drain 30582 1726855273.20530: waiting for pending results... 
30582 1726855273.20701: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855273.20789: in run() - task 0affcc66-ac2b-aa83-7d57-00000000020d 30582 1726855273.20803: variable 'ansible_search_path' from source: unknown 30582 1726855273.20807: variable 'ansible_search_path' from source: unknown 30582 1726855273.20835: calling self._execute() 30582 1726855273.20899: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.20905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.20914: variable 'omit' from source: magic vars 30582 1726855273.21180: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.21196: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.21199: variable 'omit' from source: magic vars 30582 1726855273.21240: variable 'omit' from source: magic vars 30582 1726855273.21312: variable 'network_provider' from source: set_fact 30582 1726855273.21325: variable 'omit' from source: magic vars 30582 1726855273.21356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855273.21381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855273.21401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855273.21416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855273.21427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855273.21450: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855273.21453: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855273.21456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.21532: Set connection var ansible_timeout to 10 30582 1726855273.21535: Set connection var ansible_connection to ssh 30582 1726855273.21538: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855273.21593: Set connection var ansible_pipelining to False 30582 1726855273.21597: Set connection var ansible_shell_executable to /bin/sh 30582 1726855273.21600: Set connection var ansible_shell_type to sh 30582 1726855273.21602: variable 'ansible_shell_executable' from source: unknown 30582 1726855273.21605: variable 'ansible_connection' from source: unknown 30582 1726855273.21607: variable 'ansible_module_compression' from source: unknown 30582 1726855273.21609: variable 'ansible_shell_type' from source: unknown 30582 1726855273.21611: variable 'ansible_shell_executable' from source: unknown 30582 1726855273.21613: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.21615: variable 'ansible_pipelining' from source: unknown 30582 1726855273.21617: variable 'ansible_timeout' from source: unknown 30582 1726855273.21619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.21682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855273.21690: variable 'omit' from source: magic vars 30582 1726855273.21698: starting attempt loop 30582 1726855273.21701: running the handler 30582 1726855273.21739: handler run complete 30582 1726855273.21746: attempt loop complete, returning result 30582 1726855273.21748: _execute() done 30582 1726855273.21751: dumping result to json 30582 1726855273.21753: done dumping result, returning 
30582 1726855273.21763: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-00000000020d] 30582 1726855273.21767: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020d 30582 1726855273.21842: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020d 30582 1726855273.21845: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855273.21904: no more pending results, returning what we have 30582 1726855273.21908: results queue empty 30582 1726855273.21909: checking for any_errors_fatal 30582 1726855273.21917: done checking for any_errors_fatal 30582 1726855273.21917: checking for max_fail_percentage 30582 1726855273.21919: done checking for max_fail_percentage 30582 1726855273.21920: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.21920: done checking to see if all hosts have failed 30582 1726855273.21921: getting the remaining hosts for this loop 30582 1726855273.21923: done getting the remaining hosts for this loop 30582 1726855273.21926: getting the next task for host managed_node3 30582 1726855273.21934: done getting next task for host managed_node3 30582 1726855273.21937: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855273.21942: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.21953: getting variables 30582 1726855273.21954: in VariableManager get_vars() 30582 1726855273.21985: Calling all_inventory to load vars for managed_node3 30582 1726855273.21990: Calling groups_inventory to load vars for managed_node3 30582 1726855273.21992: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.22001: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.22004: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.22006: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.22766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.23737: done with get_vars() 30582 1726855273.23754: done getting variables 30582 1726855273.23826: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:01:13 -0400 (0:00:00.035) 0:00:09.588 ****** 30582 1726855273.23855: entering _queue_task() for managed_node3/fail 30582 1726855273.23856: Creating lock for fail 30582 1726855273.24099: worker is 1 (out of 1 available) 30582 1726855273.24114: exiting _queue_task() for managed_node3/fail 30582 1726855273.24125: done queuing things up, now waiting for results queue to drain 30582 1726855273.24127: waiting for pending results... 30582 1726855273.24304: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855273.24389: in run() - task 0affcc66-ac2b-aa83-7d57-00000000020e 30582 1726855273.24402: variable 'ansible_search_path' from source: unknown 30582 1726855273.24406: variable 'ansible_search_path' from source: unknown 30582 1726855273.24434: calling self._execute() 30582 1726855273.24501: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.24505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.24513: variable 'omit' from source: magic vars 30582 1726855273.24774: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.24784: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.24870: variable 'network_state' from source: role '' defaults 30582 1726855273.24878: Evaluated conditional (network_state != {}): False 30582 1726855273.24882: when evaluation is False, skipping this task 30582 1726855273.24885: _execute() done 30582 1726855273.24889: dumping result to json 30582 1726855273.24900: done dumping result, returning 30582 1726855273.24906: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-00000000020e] 30582 1726855273.24910: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020e 30582 1726855273.24992: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020e 30582 1726855273.24995: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855273.25064: no more pending results, returning what we have 30582 1726855273.25068: results queue empty 30582 1726855273.25069: checking for any_errors_fatal 30582 1726855273.25076: done checking for any_errors_fatal 30582 1726855273.25077: checking for max_fail_percentage 30582 1726855273.25079: done checking for max_fail_percentage 30582 1726855273.25080: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.25081: done checking to see if all hosts have failed 30582 1726855273.25081: getting the remaining hosts for this loop 30582 1726855273.25083: done getting the remaining hosts for this loop 30582 1726855273.25086: getting the next task for host managed_node3 30582 1726855273.25096: done getting next task for host managed_node3 30582 1726855273.25099: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855273.25105: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.25120: getting variables 30582 1726855273.25122: in VariableManager get_vars() 30582 1726855273.25150: Calling all_inventory to load vars for managed_node3 30582 1726855273.25153: Calling groups_inventory to load vars for managed_node3 30582 1726855273.25155: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.25163: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.25165: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.25168: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.25915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.26779: done with get_vars() 30582 1726855273.26800: done getting variables 30582 1726855273.26845: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:01:13 -0400 (0:00:00.030) 0:00:09.618 ****** 30582 1726855273.26870: entering _queue_task() for managed_node3/fail 30582 1726855273.27111: worker is 1 (out of 1 available) 30582 1726855273.27127: exiting _queue_task() for managed_node3/fail 30582 1726855273.27139: done queuing things up, now waiting for results queue to drain 30582 1726855273.27141: waiting for pending results... 30582 1726855273.27319: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855273.27411: in run() - task 0affcc66-ac2b-aa83-7d57-00000000020f 30582 1726855273.27421: variable 'ansible_search_path' from source: unknown 30582 1726855273.27424: variable 'ansible_search_path' from source: unknown 30582 1726855273.27452: calling self._execute() 30582 1726855273.27521: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.27524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.27533: variable 'omit' from source: magic vars 30582 1726855273.27800: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.27812: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.27891: variable 'network_state' from source: role '' defaults 30582 1726855273.27902: Evaluated conditional (network_state != {}): False 30582 1726855273.27905: when evaluation is False, skipping this task 30582 1726855273.27908: _execute() done 30582 1726855273.27913: dumping result to json 30582 1726855273.27916: done dumping result, returning 30582 1726855273.27926: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-00000000020f] 30582 1726855273.27929: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020f 30582 1726855273.28010: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000020f 30582 1726855273.28013: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855273.28073: no more pending results, returning what we have 30582 1726855273.28076: results queue empty 30582 1726855273.28077: checking for any_errors_fatal 30582 1726855273.28084: done checking for any_errors_fatal 30582 1726855273.28084: checking for max_fail_percentage 30582 1726855273.28086: done checking for max_fail_percentage 30582 1726855273.28089: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.28090: done checking to see if all hosts have failed 30582 1726855273.28090: getting the remaining hosts for this loop 30582 1726855273.28092: done getting the remaining hosts for this loop 30582 1726855273.28096: getting the next task for host managed_node3 30582 1726855273.28104: done getting next task for host managed_node3 30582 1726855273.28108: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855273.28114: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.28129: getting variables 30582 1726855273.28130: in VariableManager get_vars() 30582 1726855273.28162: Calling all_inventory to load vars for managed_node3 30582 1726855273.28164: Calling groups_inventory to load vars for managed_node3 30582 1726855273.28166: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.28174: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.28176: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.28179: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.29044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.29894: done with get_vars() 30582 1726855273.29912: done getting variables 30582 1726855273.29955: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:01:13 -0400 (0:00:00.031) 0:00:09.649 ****** 30582 1726855273.29978: entering _queue_task() for managed_node3/fail 30582 1726855273.30211: worker is 1 (out of 1 available) 30582 1726855273.30226: exiting _queue_task() for managed_node3/fail 30582 1726855273.30238: done queuing things up, now waiting for results queue to drain 30582 1726855273.30241: waiting for pending results... 30582 1726855273.30419: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855273.30512: in run() - task 0affcc66-ac2b-aa83-7d57-000000000210 30582 1726855273.30522: variable 'ansible_search_path' from source: unknown 30582 1726855273.30526: variable 'ansible_search_path' from source: unknown 30582 1726855273.30554: calling self._execute() 30582 1726855273.30619: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.30623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.30633: variable 'omit' from source: magic vars 30582 1726855273.30901: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.30914: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.31032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855273.32529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855273.32584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855273.32616: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855273.32642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855273.32664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855273.32726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.32746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.32765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.32797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.32809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.32881: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.32897: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855273.32972: variable 'ansible_distribution' from source: facts 30582 1726855273.32977: variable '__network_rh_distros' from source: role '' defaults 30582 1726855273.32989: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855273.33140: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.33157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.33174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.33208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.33218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.33251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.33266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.33282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.33315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855273.33322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.33350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.33366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.33381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.33409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.33425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.33611: variable 'network_connections' from source: include params 30582 1726855273.33620: variable 'interface' from source: play vars 30582 1726855273.33671: variable 'interface' from source: play vars 30582 1726855273.33681: variable 'network_state' from source: role '' defaults 30582 1726855273.33727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855273.33837: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855273.33865: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855273.33892: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855273.33912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855273.33942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855273.33958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855273.33982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.34003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855273.34029: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855273.34032: when evaluation is False, skipping this task 30582 1726855273.34035: _execute() done 30582 1726855273.34037: dumping result to json 30582 1726855273.34040: done dumping result, returning 30582 1726855273.34047: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000000210] 30582 1726855273.34052: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000210 30582 1726855273.34137: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000210 30582 1726855273.34139: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855273.34181: no more pending results, returning what we have 30582 1726855273.34185: results queue empty 30582 1726855273.34186: checking for any_errors_fatal 30582 1726855273.34197: done checking for any_errors_fatal 30582 1726855273.34198: checking for max_fail_percentage 30582 1726855273.34200: done checking for max_fail_percentage 30582 1726855273.34200: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.34201: done checking to see if all hosts have failed 30582 1726855273.34202: getting the remaining hosts for this loop 30582 1726855273.34203: done getting the remaining hosts for this loop 30582 1726855273.34207: getting the next task for host managed_node3 30582 1726855273.34214: done getting next task for host managed_node3 30582 1726855273.34217: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855273.34222: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.34235: getting variables 30582 1726855273.34237: in VariableManager get_vars() 30582 1726855273.34271: Calling all_inventory to load vars for managed_node3 30582 1726855273.34274: Calling groups_inventory to load vars for managed_node3 30582 1726855273.34276: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.34286: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.34293: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.34296: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.35085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.35957: done with get_vars() 30582 1726855273.35973: done getting variables 30582 1726855273.36047: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:01:13 -0400 (0:00:00.060) 0:00:09.710 ****** 30582 1726855273.36069: entering _queue_task() for managed_node3/dnf 30582 1726855273.36292: worker is 1 (out of 1 available) 30582 1726855273.36307: exiting _queue_task() for managed_node3/dnf 30582 1726855273.36319: done queuing things up, now waiting for results queue to drain 30582 1726855273.36321: waiting for pending results... 30582 1726855273.36489: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855273.36569: in run() - task 0affcc66-ac2b-aa83-7d57-000000000211 30582 1726855273.36580: variable 'ansible_search_path' from source: unknown 30582 1726855273.36584: variable 'ansible_search_path' from source: unknown 30582 1726855273.36613: calling self._execute() 30582 1726855273.36671: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.36674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.36684: variable 'omit' from source: magic vars 30582 1726855273.36952: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.36961: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.37101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855273.38805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855273.38849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855273.38879: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855273.38919: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855273.38939: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855273.39000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.39020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.39039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.39066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.39078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.39160: variable 'ansible_distribution' from source: facts 30582 1726855273.39163: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.39176: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855273.39257: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855273.39343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.39359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.39376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.39408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.39419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.39445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.39460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.39477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.39512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.39518: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.39545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.39560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.39576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.39605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.39618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.39710: variable 'network_connections' from source: include params 30582 1726855273.39720: variable 'interface' from source: play vars 30582 1726855273.39769: variable 'interface' from source: play vars 30582 1726855273.39821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855273.39935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855273.39963: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855273.39985: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855273.40011: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855273.40041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855273.40059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855273.40080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.40102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855273.40145: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855273.40298: variable 'network_connections' from source: include params 30582 1726855273.40301: variable 'interface' from source: play vars 30582 1726855273.40345: variable 'interface' from source: play vars 30582 1726855273.40369: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855273.40372: when evaluation is False, skipping this task 30582 1726855273.40376: _execute() done 30582 1726855273.40379: dumping result to json 30582 1726855273.40381: done dumping result, returning 30582 1726855273.40391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000211] 30582 
1726855273.40398: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000211 30582 1726855273.40482: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000211 30582 1726855273.40485: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855273.40533: no more pending results, returning what we have 30582 1726855273.40537: results queue empty 30582 1726855273.40538: checking for any_errors_fatal 30582 1726855273.40544: done checking for any_errors_fatal 30582 1726855273.40544: checking for max_fail_percentage 30582 1726855273.40546: done checking for max_fail_percentage 30582 1726855273.40547: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.40548: done checking to see if all hosts have failed 30582 1726855273.40548: getting the remaining hosts for this loop 30582 1726855273.40550: done getting the remaining hosts for this loop 30582 1726855273.40553: getting the next task for host managed_node3 30582 1726855273.40561: done getting next task for host managed_node3 30582 1726855273.40564: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855273.40569: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.40582: getting variables 30582 1726855273.40584: in VariableManager get_vars() 30582 1726855273.40620: Calling all_inventory to load vars for managed_node3 30582 1726855273.40623: Calling groups_inventory to load vars for managed_node3 30582 1726855273.40625: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.40636: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.40638: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.40640: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.41538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.42915: done with get_vars() 30582 1726855273.42940: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855273.43021: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:01:13 -0400 (0:00:00.069) 0:00:09.780 ****** 30582 1726855273.43055: entering _queue_task() for managed_node3/yum 30582 1726855273.43057: Creating lock for yum 30582 1726855273.43395: worker is 1 (out of 1 available) 30582 1726855273.43409: exiting _queue_task() for managed_node3/yum 30582 1726855273.43421: done queuing things up, now waiting for results queue to drain 30582 1726855273.43423: waiting for pending results... 30582 1726855273.43808: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855273.43852: in run() - task 0affcc66-ac2b-aa83-7d57-000000000212 30582 1726855273.43895: variable 'ansible_search_path' from source: unknown 30582 1726855273.43900: variable 'ansible_search_path' from source: unknown 30582 1726855273.44036: calling self._execute() 30582 1726855273.44040: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.44043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.44052: variable 'omit' from source: magic vars 30582 1726855273.44436: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.44455: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.44639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855273.46784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855273.46843: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855273.46872: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855273.46901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855273.46921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855273.46981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.47004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.47022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.47049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.47060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.47133: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.47146: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855273.47149: when evaluation is False, skipping this task 30582 1726855273.47153: _execute() done 30582 1726855273.47155: dumping result to json 30582 1726855273.47158: done dumping result, returning 30582 1726855273.47166: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000212] 30582 1726855273.47170: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000212 30582 1726855273.47258: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000212 30582 1726855273.47260: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855273.47338: no more pending results, returning what we have 30582 1726855273.47341: results queue empty 30582 1726855273.47342: checking for any_errors_fatal 30582 1726855273.47348: done checking for any_errors_fatal 30582 1726855273.47348: checking for max_fail_percentage 30582 1726855273.47351: done checking for max_fail_percentage 30582 1726855273.47351: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.47352: done checking to see if all hosts have failed 30582 1726855273.47353: getting the remaining hosts for this loop 30582 1726855273.47354: done getting the remaining hosts for this loop 30582 1726855273.47358: getting the next task for host managed_node3 30582 1726855273.47366: done getting next task for host managed_node3 30582 1726855273.47369: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855273.47374: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.47394: getting variables 30582 1726855273.47396: in VariableManager get_vars() 30582 1726855273.47429: Calling all_inventory to load vars for managed_node3 30582 1726855273.47431: Calling groups_inventory to load vars for managed_node3 30582 1726855273.47433: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.47441: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.47444: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.47446: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.48271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.49226: done with get_vars() 30582 1726855273.49244: done getting variables 30582 1726855273.49286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:01:13 -0400 (0:00:00.062) 0:00:09.843 ****** 30582 1726855273.49314: entering _queue_task() for managed_node3/fail 30582 1726855273.49552: worker is 1 (out of 1 available) 30582 1726855273.49565: exiting _queue_task() for managed_node3/fail 30582 1726855273.49575: done queuing things up, now waiting for results queue to drain 30582 1726855273.49577: waiting for pending results... 30582 1726855273.49908: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855273.49912: in run() - task 0affcc66-ac2b-aa83-7d57-000000000213 30582 1726855273.49915: variable 'ansible_search_path' from source: unknown 30582 1726855273.49918: variable 'ansible_search_path' from source: unknown 30582 1726855273.49952: calling self._execute() 30582 1726855273.50039: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.50050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.50064: variable 'omit' from source: magic vars 30582 1726855273.50434: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.50452: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.50568: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855273.50760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855273.52351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855273.52410: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855273.52436: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855273.52463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855273.52484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855273.52545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.52566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.52586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.52617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.52628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.52661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.52676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.52701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.52725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.52735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.52763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.52778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.52801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.52825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.52835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.52981: variable 'network_connections' from source: include params 30582 1726855273.53194: variable 'interface' from source: play vars 30582 1726855273.53197: variable 'interface' from source: play vars 30582 1726855273.53199: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855273.53317: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855273.53357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855273.53405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855273.53436: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855273.53483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855273.53510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855273.53534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.53560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855273.53625: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855273.53882: variable 'network_connections' from source: include params 30582 1726855273.53896: variable 'interface' from source: play vars 30582 1726855273.53954: variable 'interface' from source: play vars 30582 1726855273.53986: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855273.54000: when evaluation is False, skipping this task 30582 
1726855273.54006: _execute() done 30582 1726855273.54011: dumping result to json 30582 1726855273.54017: done dumping result, returning 30582 1726855273.54027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000213] 30582 1726855273.54034: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000213 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855273.54190: no more pending results, returning what we have 30582 1726855273.54194: results queue empty 30582 1726855273.54195: checking for any_errors_fatal 30582 1726855273.54201: done checking for any_errors_fatal 30582 1726855273.54202: checking for max_fail_percentage 30582 1726855273.54204: done checking for max_fail_percentage 30582 1726855273.54205: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.54205: done checking to see if all hosts have failed 30582 1726855273.54206: getting the remaining hosts for this loop 30582 1726855273.54207: done getting the remaining hosts for this loop 30582 1726855273.54211: getting the next task for host managed_node3 30582 1726855273.54220: done getting next task for host managed_node3 30582 1726855273.54223: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855273.54228: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855273.54241: getting variables 30582 1726855273.54243: in VariableManager get_vars() 30582 1726855273.54279: Calling all_inventory to load vars for managed_node3 30582 1726855273.54282: Calling groups_inventory to load vars for managed_node3 30582 1726855273.54284: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.54295: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.54298: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.54300: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.54927: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000213 30582 1726855273.54934: WORKER PROCESS EXITING 30582 1726855273.55723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.57396: done with get_vars() 30582 1726855273.57426: done getting variables 30582 1726855273.57486: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:01:13 -0400 (0:00:00.082) 0:00:09.925 ****** 30582 1726855273.57525: entering _queue_task() for managed_node3/package 30582 1726855273.58003: worker is 1 (out of 1 available) 30582 1726855273.58015: exiting _queue_task() for managed_node3/package 30582 1726855273.58025: done queuing things up, now waiting for results queue to drain 30582 1726855273.58026: waiting for pending results... 30582 1726855273.58266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855273.58323: in run() - task 0affcc66-ac2b-aa83-7d57-000000000214 30582 1726855273.58345: variable 'ansible_search_path' from source: unknown 30582 1726855273.58365: variable 'ansible_search_path' from source: unknown 30582 1726855273.58461: calling self._execute() 30582 1726855273.58563: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.58581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.58605: variable 'omit' from source: magic vars 30582 1726855273.59047: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.59050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.59239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855273.59528: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855273.59580: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855273.59624: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855273.59699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855273.59786: variable 'network_packages' from source: role '' defaults 30582 1726855273.59909: variable '__network_provider_setup' from source: role '' defaults 30582 1726855273.59928: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855273.60005: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855273.60026: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855273.60093: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855273.60351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855273.62328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855273.62394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855273.62446: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855273.62499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855273.62536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855273.62615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.62694: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.62698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.62735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.62761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.62810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.62837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.62872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.62964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.62967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855273.63179: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855273.63393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.63397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.63399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.63403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.63415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.63503: variable 'ansible_python' from source: facts 30582 1726855273.63692: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855273.63740: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855273.64297: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855273.64301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.64303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.64306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.64334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.64354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.64518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855273.64555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855273.64583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.64705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855273.64733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855273.64954: variable 'network_connections' from source: include params 
30582 1726855273.64975: variable 'interface' from source: play vars 30582 1726855273.65085: variable 'interface' from source: play vars 30582 1726855273.65170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855273.65206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855273.65242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855273.65283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855273.65337: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855273.65645: variable 'network_connections' from source: include params 30582 1726855273.65655: variable 'interface' from source: play vars 30582 1726855273.65762: variable 'interface' from source: play vars 30582 1726855273.65829: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855273.65910: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855273.66229: variable 'network_connections' from source: include params 30582 1726855273.66240: variable 'interface' from source: play vars 30582 1726855273.66313: variable 'interface' from source: play vars 30582 1726855273.66342: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855273.66430: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855273.66757: variable 'network_connections' 
from source: include params 30582 1726855273.66767: variable 'interface' from source: play vars 30582 1726855273.66902: variable 'interface' from source: play vars 30582 1726855273.66970: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855273.67041: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855273.67052: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855273.67121: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855273.67351: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855273.67840: variable 'network_connections' from source: include params 30582 1726855273.67849: variable 'interface' from source: play vars 30582 1726855273.67913: variable 'interface' from source: play vars 30582 1726855273.67927: variable 'ansible_distribution' from source: facts 30582 1726855273.67936: variable '__network_rh_distros' from source: role '' defaults 30582 1726855273.67946: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.67980: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855273.68157: variable 'ansible_distribution' from source: facts 30582 1726855273.68166: variable '__network_rh_distros' from source: role '' defaults 30582 1726855273.68175: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.68190: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855273.68419: variable 'ansible_distribution' from source: facts 30582 1726855273.68424: variable '__network_rh_distros' from source: role '' defaults 30582 1726855273.68426: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.68428: variable 'network_provider' from source: set_fact 30582 
1726855273.68528: variable 'ansible_facts' from source: unknown 30582 1726855273.73294: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855273.73307: when evaluation is False, skipping this task 30582 1726855273.73320: _execute() done 30582 1726855273.73326: dumping result to json 30582 1726855273.73333: done dumping result, returning 30582 1726855273.73343: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000000214] 30582 1726855273.73350: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000214 30582 1726855273.73526: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000214 30582 1726855273.73530: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855273.73574: no more pending results, returning what we have 30582 1726855273.73578: results queue empty 30582 1726855273.73579: checking for any_errors_fatal 30582 1726855273.73586: done checking for any_errors_fatal 30582 1726855273.73589: checking for max_fail_percentage 30582 1726855273.73591: done checking for max_fail_percentage 30582 1726855273.73592: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.73592: done checking to see if all hosts have failed 30582 1726855273.73593: getting the remaining hosts for this loop 30582 1726855273.73595: done getting the remaining hosts for this loop 30582 1726855273.73598: getting the next task for host managed_node3 30582 1726855273.73606: done getting next task for host managed_node3 30582 1726855273.73610: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855273.73615: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855273.73629: getting variables 30582 1726855273.73631: in VariableManager get_vars() 30582 1726855273.73669: Calling all_inventory to load vars for managed_node3 30582 1726855273.73672: Calling groups_inventory to load vars for managed_node3 30582 1726855273.73674: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.73683: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.73686: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.73899: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.75459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.82411: done with get_vars() 30582 1726855273.82436: done getting variables 30582 1726855273.82482: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:01:13 -0400 (0:00:00.249) 0:00:10.175 ****** 30582 1726855273.82515: entering _queue_task() for managed_node3/package 30582 1726855273.82847: worker is 1 (out of 1 available) 30582 1726855273.82858: exiting _queue_task() for managed_node3/package 30582 1726855273.82869: done queuing things up, now waiting for results queue to drain 30582 1726855273.82871: waiting for pending results... 
30582 1726855273.83210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855273.83306: in run() - task 0affcc66-ac2b-aa83-7d57-000000000215 30582 1726855273.83495: variable 'ansible_search_path' from source: unknown 30582 1726855273.83499: variable 'ansible_search_path' from source: unknown 30582 1726855273.83501: calling self._execute() 30582 1726855273.83504: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.83508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.83511: variable 'omit' from source: magic vars 30582 1726855273.83861: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.83882: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.84011: variable 'network_state' from source: role '' defaults 30582 1726855273.84026: Evaluated conditional (network_state != {}): False 30582 1726855273.84034: when evaluation is False, skipping this task 30582 1726855273.84040: _execute() done 30582 1726855273.84046: dumping result to json 30582 1726855273.84055: done dumping result, returning 30582 1726855273.84072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000215] 30582 1726855273.84083: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000215 30582 1726855273.84329: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000215 30582 1726855273.84332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855273.84379: no more pending results, returning what we have 30582 1726855273.84383: results queue empty 30582 1726855273.84384: checking 
for any_errors_fatal 30582 1726855273.84395: done checking for any_errors_fatal 30582 1726855273.84396: checking for max_fail_percentage 30582 1726855273.84398: done checking for max_fail_percentage 30582 1726855273.84399: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.84400: done checking to see if all hosts have failed 30582 1726855273.84401: getting the remaining hosts for this loop 30582 1726855273.84402: done getting the remaining hosts for this loop 30582 1726855273.84406: getting the next task for host managed_node3 30582 1726855273.84414: done getting next task for host managed_node3 30582 1726855273.84418: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855273.84423: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855273.84441: getting variables 30582 1726855273.84443: in VariableManager get_vars() 30582 1726855273.84480: Calling all_inventory to load vars for managed_node3 30582 1726855273.84483: Calling groups_inventory to load vars for managed_node3 30582 1726855273.84485: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.84683: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.84688: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.84694: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.85912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.87767: done with get_vars() 30582 1726855273.87799: done getting variables 30582 1726855273.87873: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:01:13 -0400 (0:00:00.053) 0:00:10.229 ****** 30582 1726855273.87915: entering _queue_task() for managed_node3/package 30582 1726855273.88402: worker is 1 (out of 1 available) 30582 1726855273.88415: exiting _queue_task() for managed_node3/package 30582 1726855273.88426: done queuing things up, now waiting for results queue to drain 30582 1726855273.88428: waiting for pending results... 
30582 1726855273.89014: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855273.89019: in run() - task 0affcc66-ac2b-aa83-7d57-000000000216 30582 1726855273.89022: variable 'ansible_search_path' from source: unknown 30582 1726855273.89025: variable 'ansible_search_path' from source: unknown 30582 1726855273.89143: calling self._execute() 30582 1726855273.89443: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.89450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.89461: variable 'omit' from source: magic vars 30582 1726855273.89882: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.89903: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.90030: variable 'network_state' from source: role '' defaults 30582 1726855273.90044: Evaluated conditional (network_state != {}): False 30582 1726855273.90051: when evaluation is False, skipping this task 30582 1726855273.90058: _execute() done 30582 1726855273.90065: dumping result to json 30582 1726855273.90072: done dumping result, returning 30582 1726855273.90084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000216] 30582 1726855273.90102: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000216 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855273.90258: no more pending results, returning what we have 30582 1726855273.90263: results queue empty 30582 1726855273.90264: checking for any_errors_fatal 30582 1726855273.90272: done checking for any_errors_fatal 30582 1726855273.90273: checking for max_fail_percentage 30582 
1726855273.90276: done checking for max_fail_percentage 30582 1726855273.90278: checking to see if all hosts have failed and the running result is not ok 30582 1726855273.90279: done checking to see if all hosts have failed 30582 1726855273.90280: getting the remaining hosts for this loop 30582 1726855273.90281: done getting the remaining hosts for this loop 30582 1726855273.90286: getting the next task for host managed_node3 30582 1726855273.90297: done getting next task for host managed_node3 30582 1726855273.90301: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855273.90318: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855273.90498: getting variables 30582 1726855273.90501: in VariableManager get_vars() 30582 1726855273.90541: Calling all_inventory to load vars for managed_node3 30582 1726855273.90544: Calling groups_inventory to load vars for managed_node3 30582 1726855273.90546: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855273.90559: Calling all_plugins_play to load vars for managed_node3 30582 1726855273.90563: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855273.90566: Calling groups_plugins_play to load vars for managed_node3 30582 1726855273.91103: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000216 30582 1726855273.91107: WORKER PROCESS EXITING 30582 1726855273.92633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855273.95609: done with get_vars() 30582 1726855273.95637: done getting variables 30582 1726855273.95947: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:01:13 -0400 (0:00:00.080) 0:00:10.309 ****** 30582 1726855273.95984: entering _queue_task() for managed_node3/service 30582 1726855273.95986: Creating lock for service 30582 1726855273.96920: worker is 1 (out of 1 available) 30582 1726855273.96931: exiting _queue_task() for managed_node3/service 30582 1726855273.96941: done queuing things up, now waiting for results queue to drain 30582 1726855273.96942: waiting for pending results... 
30582 1726855273.97147: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855273.97611: in run() - task 0affcc66-ac2b-aa83-7d57-000000000217 30582 1726855273.97614: variable 'ansible_search_path' from source: unknown 30582 1726855273.97617: variable 'ansible_search_path' from source: unknown 30582 1726855273.97619: calling self._execute() 30582 1726855273.97725: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855273.97735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855273.98008: variable 'omit' from source: magic vars 30582 1726855273.98596: variable 'ansible_distribution_major_version' from source: facts 30582 1726855273.98615: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855273.98856: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855273.99233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855274.01637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855274.01722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855274.01763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855274.01812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855274.01844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855274.01932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855274.01964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.01997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.02048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.02067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.02124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.02152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.02181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.02234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.02254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.02302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.02337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.02366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.02409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.02439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.02658: variable 'network_connections' from source: include params 30582 1726855274.02661: variable 'interface' from source: play vars 30582 1726855274.02706: variable 'interface' from source: play vars 30582 1726855274.02789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855274.02962: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855274.03011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855274.03045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855274.03079: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855274.03132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855274.03203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855274.03207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.03224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855274.03285: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855274.03547: variable 'network_connections' from source: include params 30582 1726855274.03557: variable 'interface' from source: play vars 30582 1726855274.03621: variable 'interface' from source: play vars 30582 1726855274.03666: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855274.03746: when evaluation is False, skipping this task 30582 1726855274.03750: _execute() done 30582 1726855274.03752: dumping result to json 30582 1726855274.03754: done dumping result, returning 30582 1726855274.03756: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000217] 30582 1726855274.03759: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000217 30582 1726855274.03835: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000217 30582 1726855274.03847: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855274.03895: no more pending results, returning what we have 30582 1726855274.03899: results queue empty 30582 1726855274.03900: checking for any_errors_fatal 30582 1726855274.03905: done checking for any_errors_fatal 30582 1726855274.03906: checking for max_fail_percentage 30582 1726855274.03908: done checking for max_fail_percentage 30582 1726855274.03909: checking to see if all hosts have failed and the running result is not ok 30582 1726855274.03910: done checking to see if all hosts have failed 30582 1726855274.03911: getting the remaining hosts for this loop 30582 1726855274.03912: done getting the remaining hosts for this loop 30582 1726855274.03917: getting the next task for host managed_node3 30582 1726855274.03925: done getting next task for host managed_node3 30582 1726855274.03929: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855274.03934: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855274.03949: getting variables 30582 1726855274.03951: in VariableManager get_vars() 30582 1726855274.03991: Calling all_inventory to load vars for managed_node3 30582 1726855274.03995: Calling groups_inventory to load vars for managed_node3 30582 1726855274.03998: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855274.04010: Calling all_plugins_play to load vars for managed_node3 30582 1726855274.04013: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855274.04016: Calling groups_plugins_play to load vars for managed_node3 30582 1726855274.07047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855274.10276: done with get_vars() 30582 1726855274.10310: done getting variables 30582 1726855274.10370: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:01:14 -0400 (0:00:00.146) 0:00:10.456 ****** 30582 1726855274.10610: entering _queue_task() for managed_node3/service 30582 1726855274.10962: worker is 1 (out of 1 available) 30582 1726855274.10975: exiting _queue_task() for managed_node3/service 30582 1726855274.11192: done 
queuing things up, now waiting for results queue to drain 30582 1726855274.11194: waiting for pending results... 30582 1726855274.11511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855274.11517: in run() - task 0affcc66-ac2b-aa83-7d57-000000000218 30582 1726855274.11521: variable 'ansible_search_path' from source: unknown 30582 1726855274.11524: variable 'ansible_search_path' from source: unknown 30582 1726855274.11526: calling self._execute() 30582 1726855274.11621: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855274.11625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855274.11637: variable 'omit' from source: magic vars 30582 1726855274.12069: variable 'ansible_distribution_major_version' from source: facts 30582 1726855274.12095: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855274.12296: variable 'network_provider' from source: set_fact 30582 1726855274.12300: variable 'network_state' from source: role '' defaults 30582 1726855274.12306: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855274.12317: variable 'omit' from source: magic vars 30582 1726855274.12405: variable 'omit' from source: magic vars 30582 1726855274.12428: variable 'network_service_name' from source: role '' defaults 30582 1726855274.12692: variable 'network_service_name' from source: role '' defaults 30582 1726855274.12701: variable '__network_provider_setup' from source: role '' defaults 30582 1726855274.12703: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855274.12706: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855274.12724: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855274.12783: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855274.13018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855274.15776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855274.15860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855274.15910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855274.15947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855274.15982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855274.16066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.16110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.16139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.16182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.16212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.16312: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.16315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.16317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.16360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.16380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.16626: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855274.16759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.16786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.16820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.16866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.16964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.16981: variable 'ansible_python' from source: facts 30582 1726855274.17009: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855274.17102: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855274.17191: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855274.17325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.17354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.17384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.17433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.17451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.17507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855274.17640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855274.17643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.17646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855274.17648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855274.17786: variable 'network_connections' from source: include params 30582 1726855274.17805: variable 'interface' from source: play vars 30582 1726855274.18094: variable 'interface' from source: play vars 30582 1726855274.18173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855274.18606: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855274.18896: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855274.18899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855274.18935: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855274.19067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855274.19124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855274.19162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855274.19213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855274.19267: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855274.19550: variable 'network_connections' from source: include params 30582 1726855274.19560: variable 'interface' from source: play vars 30582 1726855274.19630: variable 'interface' from source: play vars 30582 1726855274.19680: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855274.19786: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855274.20096: variable 'network_connections' from source: include params 30582 1726855274.20101: variable 'interface' from source: play vars 30582 1726855274.20178: variable 'interface' from source: play vars 30582 1726855274.20205: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855274.20294: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855274.20602: variable 'network_connections' from source: include params 30582 1726855274.20605: variable 'interface' from source: play vars 30582 1726855274.20672: variable 'interface' from source: play vars 30582 1726855274.20745: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855274.20907: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855274.20914: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855274.21038: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855274.21371: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855274.22585: variable 'network_connections' from source: include params 30582 1726855274.22593: variable 'interface' from source: play vars 30582 1726855274.22654: variable 'interface' from source: play vars 30582 1726855274.22664: variable 'ansible_distribution' from source: facts 30582 1726855274.22667: variable '__network_rh_distros' from source: role '' defaults 30582 1726855274.22674: variable 'ansible_distribution_major_version' from source: facts 30582 1726855274.22740: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855274.23017: variable 'ansible_distribution' from source: facts 30582 1726855274.23020: variable '__network_rh_distros' from source: role '' defaults 30582 1726855274.23022: variable 'ansible_distribution_major_version' from source: facts 30582 1726855274.23025: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855274.23236: variable 'ansible_distribution' from source: facts 30582 1726855274.23240: variable '__network_rh_distros' from source: role '' defaults 30582 1726855274.23699: variable 'ansible_distribution_major_version' from source: facts 30582 1726855274.23702: variable 'network_provider' from source: set_fact 30582 1726855274.23704: variable 'omit' from source: magic vars 30582 1726855274.23706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855274.23709: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855274.23712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855274.23714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855274.23716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855274.23718: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855274.23720: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855274.23722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855274.23724: Set connection var ansible_timeout to 10 30582 1726855274.23726: Set connection var ansible_connection to ssh 30582 1726855274.23728: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855274.23730: Set connection var ansible_pipelining to False 30582 1726855274.23732: Set connection var ansible_shell_executable to /bin/sh 30582 1726855274.23734: Set connection var ansible_shell_type to sh 30582 1726855274.23737: variable 'ansible_shell_executable' from source: unknown 30582 1726855274.23739: variable 'ansible_connection' from source: unknown 30582 1726855274.23740: variable 'ansible_module_compression' from source: unknown 30582 1726855274.23742: variable 'ansible_shell_type' from source: unknown 30582 1726855274.23744: variable 'ansible_shell_executable' from source: unknown 30582 1726855274.23746: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855274.23748: variable 'ansible_pipelining' from source: unknown 30582 1726855274.23749: variable 'ansible_timeout' from source: unknown 30582 1726855274.23751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855274.23754: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855274.23761: variable 'omit' from source: magic vars 30582 1726855274.23763: starting attempt loop 30582 1726855274.23765: running the handler 30582 1726855274.23843: variable 'ansible_facts' from source: unknown 30582 1726855274.24681: _low_level_execute_command(): starting 30582 1726855274.24693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855274.25368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855274.25408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855274.25436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855274.25497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855274.25525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855274.25619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855274.27398: stdout chunk (state=3): >>>/root <<< 30582 1726855274.27786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855274.27795: stdout chunk (state=3): >>><<< 30582 1726855274.27797: stderr chunk (state=3): >>><<< 30582 1726855274.27967: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855274.27984: _low_level_execute_command(): starting 30582 1726855274.28006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247 `" && echo 
ansible-tmp-1726855274.2797368-31053-206089700058247="` echo /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247 `" ) && sleep 0' 30582 1726855274.28644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855274.28703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855274.28774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855274.28813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855274.28816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855274.28920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855274.30915: stdout chunk (state=3): >>>ansible-tmp-1726855274.2797368-31053-206089700058247=/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247 <<< 30582 1726855274.31193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855274.31197: stdout chunk (state=3): >>><<< 30582 1726855274.31199: stderr chunk (state=3): >>><<< 30582 
1726855274.31202: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855274.2797368-31053-206089700058247=/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855274.31204: variable 'ansible_module_compression' from source: unknown 30582 1726855274.31207: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 30582 1726855274.31210: ANSIBALLZ: Acquiring lock 30582 1726855274.31212: ANSIBALLZ: Lock acquired: 140270807060400 30582 1726855274.31214: ANSIBALLZ: Creating module 30582 1726855274.64251: ANSIBALLZ: Writing module into payload 30582 1726855274.64854: ANSIBALLZ: Writing module 30582 1726855274.64879: ANSIBALLZ: Renaming module 30582 1726855274.64885: ANSIBALLZ: Done creating module 30582 1726855274.65177: variable 'ansible_facts' from source: unknown 30582 
1726855274.65346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py 30582 1726855274.65623: Sending initial data 30582 1726855274.65626: Sent initial data (156 bytes) 30582 1726855274.66346: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855274.66354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855274.66357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855274.66406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855274.66606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855274.68205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855274.68258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855274.68320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpx38jo284 /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py <<< 30582 1726855274.68324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py" <<< 30582 1726855274.68384: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpx38jo284" to remote "/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py" <<< 30582 1726855274.71120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855274.71125: stdout chunk (state=3): >>><<< 30582 1726855274.71129: stderr chunk (state=3): >>><<< 30582 1726855274.71194: done transferring module to remote 30582 1726855274.71202: _low_level_execute_command(): starting 30582 1726855274.71207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/ /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py && sleep 0' 30582 1726855274.72508: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855274.72581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855274.72628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855274.72750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855274.74855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855274.74860: stdout chunk (state=3): >>><<< 30582 1726855274.74862: stderr chunk (state=3): >>><<< 30582 1726855274.74865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855274.74867: _low_level_execute_command(): starting 30582 1726855274.74869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/AnsiballZ_systemd.py && sleep 0' 30582 1726855274.75905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855274.75998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855274.76002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855274.76004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855274.76007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855274.76195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855274.76222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855274.76405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.05727: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": 
"707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10575872", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314196480", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2007374000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855275.06048: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855275.07702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855275.07706: stdout chunk (state=3): >>><<< 30582 1726855275.07708: stderr chunk (state=3): >>><<< 30582 1726855275.07712: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10575872", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314196480", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2007374000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855275.07863: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855275.07883: _low_level_execute_command(): starting 30582 1726855275.07890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855274.2797368-31053-206089700058247/ > /dev/null 2>&1 && sleep 0' 30582 1726855275.08483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855275.08496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.08507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.08521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.08569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.08572: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855275.08575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.08577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855275.08579: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855275.08582: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855275.08584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.08586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.08605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.08613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.08677: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855275.08680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.08700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855275.08712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855275.08734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855275.08947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.10796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855275.10860: stderr chunk (state=3): >>><<< 30582 1726855275.10880: stdout chunk (state=3): >>><<< 30582 1726855275.10908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855275.10929: handler run complete 30582 1726855275.11103: attempt loop complete, returning result 30582 1726855275.11106: _execute() done 30582 1726855275.11108: dumping result to json 30582 1726855275.11110: done dumping result, returning 30582 1726855275.11113: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000000218] 30582 1726855275.11115: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000218 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855275.11651: no more pending results, returning what we have 30582 1726855275.11655: results queue empty 30582 1726855275.11657: checking for any_errors_fatal 30582 1726855275.11663: done checking for any_errors_fatal 30582 1726855275.11664: checking for max_fail_percentage 30582 1726855275.11781: done checking for max_fail_percentage 30582 1726855275.11782: checking to see if all hosts have failed and the running result is not ok 30582 1726855275.11783: done checking to see if all hosts have failed 30582 1726855275.11784: getting the remaining 
hosts for this loop 30582 1726855275.11791: done getting the remaining hosts for this loop 30582 1726855275.11795: getting the next task for host managed_node3 30582 1726855275.11803: done getting next task for host managed_node3 30582 1726855275.11807: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855275.11812: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855275.11824: getting variables 30582 1726855275.11826: in VariableManager get_vars() 30582 1726855275.11859: Calling all_inventory to load vars for managed_node3 30582 1726855275.11862: Calling groups_inventory to load vars for managed_node3 30582 1726855275.11865: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855275.11876: Calling all_plugins_play to load vars for managed_node3 30582 1726855275.11880: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855275.11883: Calling groups_plugins_play to load vars for managed_node3 30582 1726855275.11904: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000218 30582 1726855275.12630: WORKER PROCESS EXITING 30582 1726855275.13554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855275.15474: done with get_vars() 30582 1726855275.15515: done getting variables 30582 1726855275.15595: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:01:15 -0400 (0:00:01.050) 0:00:11.506 ****** 30582 1726855275.15647: entering _queue_task() for managed_node3/service 30582 1726855275.16055: worker is 1 (out of 1 available) 30582 1726855275.16081: exiting _queue_task() for managed_node3/service 30582 1726855275.16095: done queuing things up, now waiting for results queue to drain 30582 1726855275.16097: waiting for pending results... 
30582 1726855275.16375: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855275.16495: in run() - task 0affcc66-ac2b-aa83-7d57-000000000219 30582 1726855275.16518: variable 'ansible_search_path' from source: unknown 30582 1726855275.16527: variable 'ansible_search_path' from source: unknown 30582 1726855275.16567: calling self._execute() 30582 1726855275.16695: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.16699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.16702: variable 'omit' from source: magic vars 30582 1726855275.17081: variable 'ansible_distribution_major_version' from source: facts 30582 1726855275.17105: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855275.17260: variable 'network_provider' from source: set_fact 30582 1726855275.17263: Evaluated conditional (network_provider == "nm"): True 30582 1726855275.17332: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855275.17426: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855275.17604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855275.19749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855275.19994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855275.19998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855275.20000: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855275.20002: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855275.20026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855275.20059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855275.20094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855275.20143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855275.20162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855275.20221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855275.20252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855275.20280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855275.20328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855275.20352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855275.20402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855275.20429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855275.20461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855275.20507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855275.20526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855275.20673: variable 'network_connections' from source: include params 30582 1726855275.20696: variable 'interface' from source: play vars 30582 1726855275.20774: variable 'interface' from source: play vars 30582 1726855275.20855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855275.21103: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855275.21106: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855275.21114: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855275.21146: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855275.21193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855275.21225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855275.21253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855275.21281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855275.21338: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855275.21771: variable 'network_connections' from source: include params 30582 1726855275.21775: variable 'interface' from source: play vars 30582 1726855275.21826: variable 'interface' from source: play vars 30582 1726855275.21857: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855275.21860: when evaluation is False, skipping this task 30582 1726855275.21864: _execute() done 30582 1726855275.21872: dumping result to json 30582 1726855275.21874: done dumping result, returning 30582 1726855275.21884: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000000219] 30582 
1726855275.21899: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000219 30582 1726855275.21978: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000219 30582 1726855275.21981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855275.22031: no more pending results, returning what we have 30582 1726855275.22035: results queue empty 30582 1726855275.22036: checking for any_errors_fatal 30582 1726855275.22054: done checking for any_errors_fatal 30582 1726855275.22055: checking for max_fail_percentage 30582 1726855275.22057: done checking for max_fail_percentage 30582 1726855275.22058: checking to see if all hosts have failed and the running result is not ok 30582 1726855275.22059: done checking to see if all hosts have failed 30582 1726855275.22059: getting the remaining hosts for this loop 30582 1726855275.22061: done getting the remaining hosts for this loop 30582 1726855275.22065: getting the next task for host managed_node3 30582 1726855275.22072: done getting next task for host managed_node3 30582 1726855275.22076: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855275.22080: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855275.22098: getting variables 30582 1726855275.22100: in VariableManager get_vars() 30582 1726855275.22136: Calling all_inventory to load vars for managed_node3 30582 1726855275.22139: Calling groups_inventory to load vars for managed_node3 30582 1726855275.22141: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855275.22151: Calling all_plugins_play to load vars for managed_node3 30582 1726855275.22154: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855275.22156: Calling groups_plugins_play to load vars for managed_node3 30582 1726855275.23035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855275.24166: done with get_vars() 30582 1726855275.24194: done getting variables 30582 1726855275.24251: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:01:15 -0400 (0:00:00.086) 0:00:11.592 
****** 30582 1726855275.24282: entering _queue_task() for managed_node3/service 30582 1726855275.24653: worker is 1 (out of 1 available) 30582 1726855275.24667: exiting _queue_task() for managed_node3/service 30582 1726855275.24686: done queuing things up, now waiting for results queue to drain 30582 1726855275.24690: waiting for pending results... 30582 1726855275.24876: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855275.24948: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021a 30582 1726855275.24959: variable 'ansible_search_path' from source: unknown 30582 1726855275.24963: variable 'ansible_search_path' from source: unknown 30582 1726855275.25022: calling self._execute() 30582 1726855275.25078: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.25086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.25094: variable 'omit' from source: magic vars 30582 1726855275.25380: variable 'ansible_distribution_major_version' from source: facts 30582 1726855275.25390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855275.25471: variable 'network_provider' from source: set_fact 30582 1726855275.25475: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855275.25479: when evaluation is False, skipping this task 30582 1726855275.25481: _execute() done 30582 1726855275.25486: dumping result to json 30582 1726855275.25491: done dumping result, returning 30582 1726855275.25500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-00000000021a] 30582 1726855275.25504: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021a 30582 1726855275.25590: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021a 30582 1726855275.25593: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855275.25638: no more pending results, returning what we have 30582 1726855275.25642: results queue empty 30582 1726855275.25643: checking for any_errors_fatal 30582 1726855275.25652: done checking for any_errors_fatal 30582 1726855275.25652: checking for max_fail_percentage 30582 1726855275.25654: done checking for max_fail_percentage 30582 1726855275.25655: checking to see if all hosts have failed and the running result is not ok 30582 1726855275.25656: done checking to see if all hosts have failed 30582 1726855275.25656: getting the remaining hosts for this loop 30582 1726855275.25658: done getting the remaining hosts for this loop 30582 1726855275.25661: getting the next task for host managed_node3 30582 1726855275.25670: done getting next task for host managed_node3 30582 1726855275.25673: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855275.25677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855275.25695: getting variables 30582 1726855275.25697: in VariableManager get_vars() 30582 1726855275.25730: Calling all_inventory to load vars for managed_node3 30582 1726855275.25733: Calling groups_inventory to load vars for managed_node3 30582 1726855275.25735: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855275.25744: Calling all_plugins_play to load vars for managed_node3 30582 1726855275.25746: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855275.25748: Calling groups_plugins_play to load vars for managed_node3 30582 1726855275.26721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855275.28709: done with get_vars() 30582 1726855275.28731: done getting variables 30582 1726855275.28775: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:01:15 -0400 (0:00:00.045) 0:00:11.637 ****** 30582 1726855275.28806: entering _queue_task() for managed_node3/copy 30582 1726855275.29051: worker is 1 (out of 1 available) 30582 1726855275.29064: exiting _queue_task() for managed_node3/copy 30582 1726855275.29078: done queuing things up, now waiting for results queue to drain 30582 1726855275.29079: waiting for 
pending results... 30582 1726855275.29258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855275.29340: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021b 30582 1726855275.29352: variable 'ansible_search_path' from source: unknown 30582 1726855275.29355: variable 'ansible_search_path' from source: unknown 30582 1726855275.29399: calling self._execute() 30582 1726855275.29599: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.29603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.29605: variable 'omit' from source: magic vars 30582 1726855275.29973: variable 'ansible_distribution_major_version' from source: facts 30582 1726855275.29977: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855275.30133: variable 'network_provider' from source: set_fact 30582 1726855275.30140: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855275.30143: when evaluation is False, skipping this task 30582 1726855275.30145: _execute() done 30582 1726855275.30148: dumping result to json 30582 1726855275.30152: done dumping result, returning 30582 1726855275.30202: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-00000000021b] 30582 1726855275.30205: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021b 30582 1726855275.30463: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021b 30582 1726855275.30466: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855275.30522: no more pending results, returning what we have 30582 1726855275.30527: results queue empty 30582 
1726855275.30528: checking for any_errors_fatal 30582 1726855275.30537: done checking for any_errors_fatal 30582 1726855275.30538: checking for max_fail_percentage 30582 1726855275.30540: done checking for max_fail_percentage 30582 1726855275.30541: checking to see if all hosts have failed and the running result is not ok 30582 1726855275.30541: done checking to see if all hosts have failed 30582 1726855275.30542: getting the remaining hosts for this loop 30582 1726855275.30543: done getting the remaining hosts for this loop 30582 1726855275.30547: getting the next task for host managed_node3 30582 1726855275.30555: done getting next task for host managed_node3 30582 1726855275.30558: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855275.30563: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855275.30577: getting variables 30582 1726855275.30579: in VariableManager get_vars() 30582 1726855275.30618: Calling all_inventory to load vars for managed_node3 30582 1726855275.30621: Calling groups_inventory to load vars for managed_node3 30582 1726855275.30623: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855275.30635: Calling all_plugins_play to load vars for managed_node3 30582 1726855275.30638: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855275.30641: Calling groups_plugins_play to load vars for managed_node3 30582 1726855275.32924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855275.35108: done with get_vars() 30582 1726855275.35132: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:01:15 -0400 (0:00:00.064) 0:00:11.702 ****** 30582 1726855275.35224: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855275.35226: Creating lock for fedora.linux_system_roles.network_connections 30582 1726855275.35561: worker is 1 (out of 1 available) 30582 1726855275.35572: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855275.35584: done queuing things up, now waiting for results queue to drain 30582 1726855275.35585: waiting for pending results... 
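The skipped tasks above ("Enable network service", "Ensure initscripts network file dependency is present") each show the pattern `Evaluated conditional (...): True/False` followed by `when evaluation is False, skipping this task`. A hedged sketch of the kind of role task that produces those entries — the task name, module, and both condition strings are taken from the log, but the task body itself is illustrative, not the actual `fedora.linux_system_roles.network` source:

```yaml
# Illustrative only: a task guarded the way the log shows.
# Each `when` item is evaluated in order; here the first condition
# passed (True) and the second failed (False), so the task is skipped
# and the result carries "false_condition" and "skip_reason".
- name: Enable network service
  ansible.builtin.service:
    name: network        # hypothetical service name
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```

Note that for the "Enable network service" skip the result was censored (`'no_log: true' was specified for this result`), which is why the log prints the censored placeholder instead of `false_condition` for that task.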
30582 1726855275.35998: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855275.36086: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021c 30582 1726855275.36167: variable 'ansible_search_path' from source: unknown 30582 1726855275.36170: variable 'ansible_search_path' from source: unknown 30582 1726855275.36174: calling self._execute() 30582 1726855275.36252: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.36263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.36282: variable 'omit' from source: magic vars 30582 1726855275.36691: variable 'ansible_distribution_major_version' from source: facts 30582 1726855275.36714: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855275.36726: variable 'omit' from source: magic vars 30582 1726855275.36821: variable 'omit' from source: magic vars 30582 1726855275.36962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855275.39205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855275.39299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855275.39381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855275.39384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855275.39408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855275.39500: variable 'network_provider' from source: set_fact 30582 1726855275.39636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855275.39667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855275.39709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855275.39757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855275.39794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855275.39863: variable 'omit' from source: magic vars 30582 1726855275.39984: variable 'omit' from source: magic vars 30582 1726855275.40138: variable 'network_connections' from source: include params 30582 1726855275.40143: variable 'interface' from source: play vars 30582 1726855275.40209: variable 'interface' from source: play vars 30582 1726855275.40382: variable 'omit' from source: magic vars 30582 1726855275.40594: variable '__lsr_ansible_managed' from source: task vars 30582 1726855275.40597: variable '__lsr_ansible_managed' from source: task vars 30582 1726855275.40637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855275.40866: Loaded config def from plugin (lookup/template) 30582 1726855275.40875: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855275.40908: File lookup term: get_ansible_managed.j2 30582 1726855275.40915: variable 
'ansible_search_path' from source: unknown 30582 1726855275.40930: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855275.40947: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855275.40968: variable 'ansible_search_path' from source: unknown 30582 1726855275.47328: variable 'ansible_managed' from source: unknown 30582 1726855275.47470: variable 'omit' from source: magic vars 30582 1726855275.47508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855275.47544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855275.47565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855275.47593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855275.47611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855275.47648: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855275.47659: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.47667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.47773: Set connection var ansible_timeout to 10 30582 1726855275.47793: Set connection var ansible_connection to ssh 30582 1726855275.47796: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855275.47804: Set connection var ansible_pipelining to False 30582 1726855275.47994: Set connection var ansible_shell_executable to /bin/sh 30582 1726855275.47997: Set connection var ansible_shell_type to sh 30582 1726855275.47999: variable 'ansible_shell_executable' from source: unknown 30582 1726855275.48001: variable 'ansible_connection' from source: unknown 30582 1726855275.48003: variable 'ansible_module_compression' from source: unknown 30582 1726855275.48005: variable 'ansible_shell_type' from source: unknown 30582 1726855275.48006: variable 'ansible_shell_executable' from source: unknown 30582 1726855275.48009: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855275.48011: variable 'ansible_pipelining' from source: unknown 30582 1726855275.48012: variable 'ansible_timeout' from source: unknown 30582 1726855275.48014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855275.48016: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855275.48026: variable 'omit' from 
source: magic vars 30582 1726855275.48041: starting attempt loop 30582 1726855275.48047: running the handler 30582 1726855275.48063: _low_level_execute_command(): starting 30582 1726855275.48073: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855275.48913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855275.48928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855275.49026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.50706: stdout chunk (state=3): >>>/root <<< 30582 1726855275.50844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855275.50856: stdout chunk (state=3): >>><<< 30582 1726855275.50872: stderr chunk (state=3): >>><<< 30582 1726855275.50906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855275.50925: _low_level_execute_command(): starting 30582 1726855275.50935: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654 `" && echo ansible-tmp-1726855275.509133-31104-87221170867654="` echo /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654 `" ) && sleep 0' 30582 1726855275.51547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855275.51556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.51567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.51581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.51598: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.51620: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855275.51624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.51626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855275.51637: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855275.51640: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855275.51649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.51659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.51670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.51680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.51696: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855275.51700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.51885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.53790: stdout chunk (state=3): >>>ansible-tmp-1726855275.509133-31104-87221170867654=/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654 <<< 30582 1726855275.53891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855275.53922: stderr chunk (state=3): >>><<< 30582 1726855275.53925: stdout chunk (state=3): >>><<< 30582 1726855275.53942: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855275.509133-31104-87221170867654=/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855275.53997: variable 'ansible_module_compression' from source: unknown 30582 1726855275.54100: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 30582 1726855275.54103: ANSIBALLZ: Acquiring lock 30582 1726855275.54108: ANSIBALLZ: Lock acquired: 140270804820304 30582 1726855275.54110: ANSIBALLZ: Creating module 30582 1726855275.74096: ANSIBALLZ: Writing module into payload 30582 1726855275.74382: ANSIBALLZ: Writing module 30582 1726855275.74416: ANSIBALLZ: Renaming module 30582 1726855275.74427: ANSIBALLZ: Done creating module 30582 1726855275.74461: variable 'ansible_facts' from source: unknown 30582 
1726855275.74590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py 30582 1726855275.74807: Sending initial data 30582 1726855275.74810: Sent initial data (166 bytes) 30582 1726855275.75392: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855275.75406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.75419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.75442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.75538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855275.75597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855275.75698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.77336: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855275.77386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855275.77461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpp2z6g0h4 /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py <<< 30582 1726855275.77465: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py" <<< 30582 1726855275.77533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpp2z6g0h4" to remote "/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py" <<< 30582 1726855275.78644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855275.78698: stderr chunk (state=3): >>><<< 30582 1726855275.78704: stdout chunk (state=3): >>><<< 30582 1726855275.78734: done transferring module to remote 30582 1726855275.78744: _low_level_execute_command(): starting 30582 1726855275.78749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/ /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py && sleep 0' 30582 1726855275.79331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855275.79340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.79351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.79393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.79397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.79399: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855275.79401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.79411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855275.79504: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855275.79516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855275.79608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855275.81385: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 30582 1726855275.81433: stderr chunk (state=3): >>><<< 30582 1726855275.81444: stdout chunk (state=3): >>><<< 30582 1726855275.81544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855275.81547: _low_level_execute_command(): starting 30582 1726855275.81550: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/AnsiballZ_network_connections.py && sleep 0' 30582 1726855275.82075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855275.82093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855275.82108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855275.82129: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855275.82144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855275.82155: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855275.82242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855275.82267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855275.82283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855275.82307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855275.82410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.10202: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", 
"persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855276.14016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855276.14045: stderr chunk (state=3): >>><<< 30582 1726855276.14048: stdout chunk (state=3): >>><<< 30582 1726855276.14063: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855276.14096: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855276.14104: _low_level_execute_command(): starting 30582 1726855276.14109: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855275.509133-31104-87221170867654/ > /dev/null 2>&1 && sleep 0' 30582 1726855276.14566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.14570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.14572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855276.14575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855276.14577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.14629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.14633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855276.14638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855276.14700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.16562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.16590: stderr chunk (state=3): >>><<< 30582 1726855276.16593: stdout chunk (state=3): >>><<< 30582 1726855276.16606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855276.16613: handler run complete 30582 1726855276.16639: attempt loop complete, returning result 30582 1726855276.16642: _execute() done 30582 1726855276.16645: dumping result to json 30582 1726855276.16649: done dumping result, returning 30582 1726855276.16658: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-00000000021c] 30582 1726855276.16660: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021c 30582 1726855276.16760: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021c 30582 1726855276.16763: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, 
state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8 30582 1726855276.16861: no more pending results, returning what we have 30582 1726855276.16864: results queue empty 30582 1726855276.16865: checking for any_errors_fatal 30582 1726855276.16871: done checking for any_errors_fatal 30582 1726855276.16873: checking for max_fail_percentage 30582 1726855276.16875: done checking for max_fail_percentage 30582 1726855276.16876: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.16876: done checking to see if all hosts have failed 30582 1726855276.16877: getting the remaining hosts for this loop 30582 1726855276.16878: done getting the remaining hosts for this loop 30582 1726855276.16882: getting the next task for host managed_node3 30582 1726855276.16892: done getting next task for host managed_node3 30582 1726855276.16896: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855276.16900: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855276.16912: getting variables 30582 1726855276.16913: in VariableManager get_vars() 30582 1726855276.16947: Calling all_inventory to load vars for managed_node3 30582 1726855276.16950: Calling groups_inventory to load vars for managed_node3 30582 1726855276.16952: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.16961: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.16964: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.16966: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.17792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.18651: done with get_vars() 30582 1726855276.18668: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:01:16 -0400 (0:00:00.835) 0:00:12.537 ****** 30582 1726855276.18733: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855276.18735: Creating lock for fedora.linux_system_roles.network_state 30582 1726855276.18967: worker is 1 (out of 1 available) 30582 1726855276.18981: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855276.18997: done queuing things up, now waiting for results queue to drain 30582 1726855276.19000: waiting for pending results... 
30582 1726855276.19170: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855276.19253: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021d 30582 1726855276.19265: variable 'ansible_search_path' from source: unknown 30582 1726855276.19268: variable 'ansible_search_path' from source: unknown 30582 1726855276.19298: calling self._execute() 30582 1726855276.19365: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.19368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.19377: variable 'omit' from source: magic vars 30582 1726855276.19645: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.19657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.19742: variable 'network_state' from source: role '' defaults 30582 1726855276.19749: Evaluated conditional (network_state != {}): False 30582 1726855276.19752: when evaluation is False, skipping this task 30582 1726855276.19755: _execute() done 30582 1726855276.19757: dumping result to json 30582 1726855276.19762: done dumping result, returning 30582 1726855276.19775: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-00000000021d] 30582 1726855276.19777: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021d 30582 1726855276.19855: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021d 30582 1726855276.19858: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855276.19936: no more pending results, returning what we have 30582 1726855276.19939: results queue empty 30582 1726855276.19940: checking for any_errors_fatal 30582 1726855276.19948: done checking for any_errors_fatal 
30582 1726855276.19949: checking for max_fail_percentage 30582 1726855276.19951: done checking for max_fail_percentage 30582 1726855276.19951: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.19952: done checking to see if all hosts have failed 30582 1726855276.19953: getting the remaining hosts for this loop 30582 1726855276.19954: done getting the remaining hosts for this loop 30582 1726855276.19958: getting the next task for host managed_node3 30582 1726855276.19965: done getting next task for host managed_node3 30582 1726855276.19968: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855276.19973: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855276.19997: getting variables 30582 1726855276.19998: in VariableManager get_vars() 30582 1726855276.20028: Calling all_inventory to load vars for managed_node3 30582 1726855276.20030: Calling groups_inventory to load vars for managed_node3 30582 1726855276.20032: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.20040: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.20042: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.20044: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.20860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.21711: done with get_vars() 30582 1726855276.21727: done getting variables 30582 1726855276.21768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:01:16 -0400 (0:00:00.030) 0:00:12.567 ****** 30582 1726855276.21795: entering _queue_task() for managed_node3/debug 30582 1726855276.22011: worker is 1 (out of 1 available) 30582 1726855276.22025: exiting _queue_task() for managed_node3/debug 30582 1726855276.22035: done queuing things up, now waiting for results queue to drain 30582 1726855276.22037: waiting for pending results... 
30582 1726855276.22211: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855276.22298: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021e 30582 1726855276.22309: variable 'ansible_search_path' from source: unknown 30582 1726855276.22312: variable 'ansible_search_path' from source: unknown 30582 1726855276.22340: calling self._execute() 30582 1726855276.22408: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.22412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.22421: variable 'omit' from source: magic vars 30582 1726855276.22698: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.22711: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.22715: variable 'omit' from source: magic vars 30582 1726855276.22759: variable 'omit' from source: magic vars 30582 1726855276.22782: variable 'omit' from source: magic vars 30582 1726855276.22819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855276.22844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855276.22859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855276.22872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.22883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.22910: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855276.22913: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.22916: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855276.22988: Set connection var ansible_timeout to 10 30582 1726855276.22992: Set connection var ansible_connection to ssh 30582 1726855276.22999: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855276.23004: Set connection var ansible_pipelining to False 30582 1726855276.23009: Set connection var ansible_shell_executable to /bin/sh 30582 1726855276.23011: Set connection var ansible_shell_type to sh 30582 1726855276.23029: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.23032: variable 'ansible_connection' from source: unknown 30582 1726855276.23035: variable 'ansible_module_compression' from source: unknown 30582 1726855276.23037: variable 'ansible_shell_type' from source: unknown 30582 1726855276.23040: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.23042: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.23044: variable 'ansible_pipelining' from source: unknown 30582 1726855276.23046: variable 'ansible_timeout' from source: unknown 30582 1726855276.23048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.23149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855276.23158: variable 'omit' from source: magic vars 30582 1726855276.23161: starting attempt loop 30582 1726855276.23164: running the handler 30582 1726855276.23259: variable '__network_connections_result' from source: set_fact 30582 1726855276.23303: handler run complete 30582 1726855276.23316: attempt loop complete, returning result 30582 1726855276.23319: _execute() done 30582 1726855276.23322: dumping result to json 30582 1726855276.23324: 
done dumping result, returning 30582 1726855276.23332: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000021e] 30582 1726855276.23337: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021e 30582 1726855276.23426: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021e 30582 1726855276.23429: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8" ] } 30582 1726855276.23491: no more pending results, returning what we have 30582 1726855276.23495: results queue empty 30582 1726855276.23496: checking for any_errors_fatal 30582 1726855276.23502: done checking for any_errors_fatal 30582 1726855276.23503: checking for max_fail_percentage 30582 1726855276.23505: done checking for max_fail_percentage 30582 1726855276.23506: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.23507: done checking to see if all hosts have failed 30582 1726855276.23507: getting the remaining hosts for this loop 30582 1726855276.23509: done getting the remaining hosts for this loop 30582 1726855276.23512: getting the next task for host managed_node3 30582 1726855276.23519: done getting next task for host managed_node3 30582 1726855276.23522: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855276.23527: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855276.23536: getting variables 30582 1726855276.23540: in VariableManager get_vars() 30582 1726855276.23570: Calling all_inventory to load vars for managed_node3 30582 1726855276.23572: Calling groups_inventory to load vars for managed_node3 30582 1726855276.23574: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.23582: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.23584: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.23586: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.24331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.25276: done with get_vars() 30582 1726855276.25295: done getting variables 30582 1726855276.25339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:01:16 -0400 (0:00:00.035) 0:00:12.603 ****** 30582 1726855276.25366: entering _queue_task() for managed_node3/debug 30582 1726855276.25585: worker is 1 (out of 1 available) 30582 1726855276.25601: exiting _queue_task() for managed_node3/debug 30582 1726855276.25613: done queuing things up, now waiting for results queue to drain 30582 1726855276.25614: waiting for pending results... 30582 1726855276.25789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855276.25880: in run() - task 0affcc66-ac2b-aa83-7d57-00000000021f 30582 1726855276.25894: variable 'ansible_search_path' from source: unknown 30582 1726855276.25898: variable 'ansible_search_path' from source: unknown 30582 1726855276.25926: calling self._execute() 30582 1726855276.25992: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.25999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.26007: variable 'omit' from source: magic vars 30582 1726855276.26281: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.26290: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.26297: variable 'omit' from source: magic vars 30582 1726855276.26340: variable 'omit' from source: magic vars 30582 1726855276.26363: variable 'omit' from source: magic vars 30582 1726855276.26402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855276.26428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855276.26443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855276.26456: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.26466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.26497: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855276.26500: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.26502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.26571: Set connection var ansible_timeout to 10 30582 1726855276.26574: Set connection var ansible_connection to ssh 30582 1726855276.26579: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855276.26584: Set connection var ansible_pipelining to False 30582 1726855276.26593: Set connection var ansible_shell_executable to /bin/sh 30582 1726855276.26598: Set connection var ansible_shell_type to sh 30582 1726855276.26617: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.26620: variable 'ansible_connection' from source: unknown 30582 1726855276.26622: variable 'ansible_module_compression' from source: unknown 30582 1726855276.26625: variable 'ansible_shell_type' from source: unknown 30582 1726855276.26627: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.26629: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.26631: variable 'ansible_pipelining' from source: unknown 30582 1726855276.26633: variable 'ansible_timeout' from source: unknown 30582 1726855276.26635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.26742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855276.26750: variable 'omit' from source: magic vars 30582 1726855276.26755: starting attempt loop 30582 1726855276.26758: running the handler 30582 1726855276.26799: variable '__network_connections_result' from source: set_fact 30582 1726855276.26859: variable '__network_connections_result' from source: set_fact 30582 1726855276.26938: handler run complete 30582 1726855276.26957: attempt loop complete, returning result 30582 1726855276.26960: _execute() done 30582 1726855276.26962: dumping result to json 30582 1726855276.26964: done dumping result, returning 30582 1726855276.26973: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000021f] 30582 1726855276.26977: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021f 30582 1726855276.27065: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000021f 30582 1726855276.27068: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8" ] } } 30582 1726855276.27149: no more pending results, returning what we have 30582 1726855276.27152: results queue 
empty 30582 1726855276.27153: checking for any_errors_fatal 30582 1726855276.27161: done checking for any_errors_fatal 30582 1726855276.27162: checking for max_fail_percentage 30582 1726855276.27163: done checking for max_fail_percentage 30582 1726855276.27164: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.27165: done checking to see if all hosts have failed 30582 1726855276.27166: getting the remaining hosts for this loop 30582 1726855276.27167: done getting the remaining hosts for this loop 30582 1726855276.27170: getting the next task for host managed_node3 30582 1726855276.27180: done getting next task for host managed_node3 30582 1726855276.27183: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855276.27186: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855276.27198: getting variables 30582 1726855276.27200: in VariableManager get_vars() 30582 1726855276.27233: Calling all_inventory to load vars for managed_node3 30582 1726855276.27235: Calling groups_inventory to load vars for managed_node3 30582 1726855276.27237: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.27245: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.27247: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.27249: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.28008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.28871: done with get_vars() 30582 1726855276.28890: done getting variables 30582 1726855276.28935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:01:16 -0400 (0:00:00.035) 0:00:12.639 ****** 30582 1726855276.28959: entering _queue_task() for managed_node3/debug 30582 1726855276.29184: worker is 1 (out of 1 available) 30582 1726855276.29199: exiting _queue_task() for managed_node3/debug 30582 1726855276.29211: done queuing things up, now waiting for results queue to drain 30582 1726855276.29213: waiting for pending results... 
30582 1726855276.29390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855276.29483: in run() - task 0affcc66-ac2b-aa83-7d57-000000000220 30582 1726855276.29499: variable 'ansible_search_path' from source: unknown 30582 1726855276.29502: variable 'ansible_search_path' from source: unknown 30582 1726855276.29528: calling self._execute() 30582 1726855276.29596: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.29600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.29609: variable 'omit' from source: magic vars 30582 1726855276.29869: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.29879: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.29965: variable 'network_state' from source: role '' defaults 30582 1726855276.29973: Evaluated conditional (network_state != {}): False 30582 1726855276.29976: when evaluation is False, skipping this task 30582 1726855276.29981: _execute() done 30582 1726855276.29983: dumping result to json 30582 1726855276.29986: done dumping result, returning 30582 1726855276.29998: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000000220] 30582 1726855276.30001: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000220 30582 1726855276.30080: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000220 30582 1726855276.30083: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855276.30139: no more pending results, returning what we have 30582 1726855276.30143: results queue empty 30582 1726855276.30145: checking for any_errors_fatal 30582 1726855276.30153: done checking for any_errors_fatal 30582 1726855276.30153: checking for 
max_fail_percentage 30582 1726855276.30155: done checking for max_fail_percentage 30582 1726855276.30156: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.30157: done checking to see if all hosts have failed 30582 1726855276.30157: getting the remaining hosts for this loop 30582 1726855276.30159: done getting the remaining hosts for this loop 30582 1726855276.30162: getting the next task for host managed_node3 30582 1726855276.30169: done getting next task for host managed_node3 30582 1726855276.30172: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855276.30177: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855276.30193: getting variables 30582 1726855276.30195: in VariableManager get_vars() 30582 1726855276.30222: Calling all_inventory to load vars for managed_node3 30582 1726855276.30225: Calling groups_inventory to load vars for managed_node3 30582 1726855276.30227: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.30235: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.30237: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.30239: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.31091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.31942: done with get_vars() 30582 1726855276.31958: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:01:16 -0400 (0:00:00.030) 0:00:12.670 ****** 30582 1726855276.32031: entering _queue_task() for managed_node3/ping 30582 1726855276.32032: Creating lock for ping 30582 1726855276.32298: worker is 1 (out of 1 available) 30582 1726855276.32313: exiting _queue_task() for managed_node3/ping 30582 1726855276.32324: done queuing things up, now waiting for results queue to drain 30582 1726855276.32326: waiting for pending results... 
30582 1726855276.32563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855276.32656: in run() - task 0affcc66-ac2b-aa83-7d57-000000000221 30582 1726855276.32666: variable 'ansible_search_path' from source: unknown 30582 1726855276.32669: variable 'ansible_search_path' from source: unknown 30582 1726855276.32704: calling self._execute() 30582 1726855276.32763: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.32766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.32775: variable 'omit' from source: magic vars 30582 1726855276.33041: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.33051: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.33057: variable 'omit' from source: magic vars 30582 1726855276.33101: variable 'omit' from source: magic vars 30582 1726855276.33127: variable 'omit' from source: magic vars 30582 1726855276.33157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855276.33182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855276.33202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855276.33217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.33229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.33250: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855276.33254: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.33256: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855276.33334: Set connection var ansible_timeout to 10 30582 1726855276.33341: Set connection var ansible_connection to ssh 30582 1726855276.33343: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855276.33346: Set connection var ansible_pipelining to False 30582 1726855276.33348: Set connection var ansible_shell_executable to /bin/sh 30582 1726855276.33350: Set connection var ansible_shell_type to sh 30582 1726855276.33365: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.33367: variable 'ansible_connection' from source: unknown 30582 1726855276.33370: variable 'ansible_module_compression' from source: unknown 30582 1726855276.33372: variable 'ansible_shell_type' from source: unknown 30582 1726855276.33375: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.33377: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.33381: variable 'ansible_pipelining' from source: unknown 30582 1726855276.33383: variable 'ansible_timeout' from source: unknown 30582 1726855276.33388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.33533: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855276.33541: variable 'omit' from source: magic vars 30582 1726855276.33548: starting attempt loop 30582 1726855276.33551: running the handler 30582 1726855276.33562: _low_level_execute_command(): starting 30582 1726855276.33570: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855276.34073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855276.34076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.34079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.34081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.34139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.34142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855276.34147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855276.34216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.35912: stdout chunk (state=3): >>>/root <<< 30582 1726855276.36014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.36039: stderr chunk (state=3): >>><<< 30582 1726855276.36042: stdout chunk (state=3): >>><<< 30582 1726855276.36064: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855276.36075: _low_level_execute_command(): starting 30582 1726855276.36081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482 `" && echo ansible-tmp-1726855276.3606403-31143-52596731845482="` echo /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482 `" ) && sleep 0' 30582 1726855276.36496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.36499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.36508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration <<< 30582 1726855276.36510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855276.36512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.36551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.36554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855276.36621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.38524: stdout chunk (state=3): >>>ansible-tmp-1726855276.3606403-31143-52596731845482=/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482 <<< 30582 1726855276.38798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.38801: stdout chunk (state=3): >>><<< 30582 1726855276.38803: stderr chunk (state=3): >>><<< 30582 1726855276.38806: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855276.3606403-31143-52596731845482=/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855276.38808: variable 'ansible_module_compression' from source: unknown 30582 1726855276.38810: ANSIBALLZ: Using lock for ping 30582 1726855276.38812: ANSIBALLZ: Acquiring lock 30582 1726855276.38814: ANSIBALLZ: Lock acquired: 140270805496272 30582 1726855276.38816: ANSIBALLZ: Creating module 30582 1726855276.51395: ANSIBALLZ: Writing module into payload 30582 1726855276.51399: ANSIBALLZ: Writing module 30582 1726855276.51401: ANSIBALLZ: Renaming module 30582 1726855276.51403: ANSIBALLZ: Done creating module 30582 1726855276.51406: variable 'ansible_facts' from source: unknown 30582 1726855276.51555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py 30582 1726855276.51940: Sending initial data 30582 1726855276.51954: Sent initial data (152 bytes) 30582 1726855276.52955: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855276.52967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.53007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855276.53020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.53031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.53200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.53212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855276.53237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855276.53420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.55081: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855276.55107: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855276.55310: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855276.55457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi2v_1sne /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py <<< 30582 1726855276.55461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi2v_1sne" to remote "/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py" <<< 30582 1726855276.56896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.56899: stdout chunk (state=3): >>><<< 30582 1726855276.56902: stderr chunk (state=3): >>><<< 30582 1726855276.56904: done transferring module to remote 30582 1726855276.56906: _low_level_execute_command(): starting 30582 1726855276.56908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/ /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py && sleep 0' 30582 1726855276.57833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855276.57847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.57862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.58117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.58215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.60197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.60201: stdout chunk (state=3): >>><<< 30582 1726855276.60203: stderr chunk (state=3): >>><<< 30582 1726855276.60206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855276.60208: _low_level_execute_command(): starting 30582 1726855276.60211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/AnsiballZ_ping.py && sleep 0' 30582 1726855276.61431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.61434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.61436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.61438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855276.61490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855276.61707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 <<< 30582 1726855276.76562: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855276.77862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855276.77908: stderr chunk (state=3): >>><<< 30582 1726855276.77917: stdout chunk (state=3): >>><<< 30582 1726855276.77937: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855276.77967: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855276.77982: _low_level_execute_command(): starting 30582 1726855276.77994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855276.3606403-31143-52596731845482/ > /dev/null 2>&1 && sleep 0' 30582 1726855276.78582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855276.78601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855276.78614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855276.78631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855276.78729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855276.78778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855276.78907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855276.80763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855276.80810: stdout chunk (state=3): >>><<< 30582 1726855276.80830: stderr chunk (state=3): >>><<< 30582 1726855276.80850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855276.80861: handler run complete 30582 
1726855276.80882: attempt loop complete, returning result 30582 1726855276.80908: _execute() done 30582 1726855276.80915: dumping result to json 30582 1726855276.81047: done dumping result, returning 30582 1726855276.81051: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000000221] 30582 1726855276.81053: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000221 30582 1726855276.81134: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000221 30582 1726855276.81137: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855276.81211: no more pending results, returning what we have 30582 1726855276.81215: results queue empty 30582 1726855276.81217: checking for any_errors_fatal 30582 1726855276.81223: done checking for any_errors_fatal 30582 1726855276.81224: checking for max_fail_percentage 30582 1726855276.81226: done checking for max_fail_percentage 30582 1726855276.81227: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.81228: done checking to see if all hosts have failed 30582 1726855276.81228: getting the remaining hosts for this loop 30582 1726855276.81230: done getting the remaining hosts for this loop 30582 1726855276.81235: getting the next task for host managed_node3 30582 1726855276.81496: done getting next task for host managed_node3 30582 1726855276.81499: ^ task is: TASK: meta (role_complete) 30582 1726855276.81504: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855276.81516: getting variables 30582 1726855276.81518: in VariableManager get_vars() 30582 1726855276.81557: Calling all_inventory to load vars for managed_node3 30582 1726855276.81560: Calling groups_inventory to load vars for managed_node3 30582 1726855276.81563: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.81573: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.81577: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.81580: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.83210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.85308: done with get_vars() 30582 1726855276.85338: done getting variables 30582 1726855276.85452: done queuing things up, now waiting for results queue to drain 30582 1726855276.85454: results queue empty 30582 1726855276.85455: checking for any_errors_fatal 30582 1726855276.85458: done checking for any_errors_fatal 30582 1726855276.85459: checking for max_fail_percentage 30582 1726855276.85460: done checking for max_fail_percentage 30582 1726855276.85461: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855276.85462: done checking to see if all hosts have failed 30582 1726855276.85462: getting the remaining hosts for this loop 30582 1726855276.85463: done getting the remaining hosts for this loop 30582 1726855276.85466: getting the next task for host managed_node3 30582 1726855276.85471: done getting next task for host managed_node3 30582 1726855276.85474: ^ task is: TASK: Show result 30582 1726855276.85477: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855276.85480: getting variables 30582 1726855276.85481: in VariableManager get_vars() 30582 1726855276.85496: Calling all_inventory to load vars for managed_node3 30582 1726855276.85498: Calling groups_inventory to load vars for managed_node3 30582 1726855276.85500: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.85505: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.85508: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.85510: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.86347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.87221: done with get_vars() 30582 1726855276.87240: done getting variables 30582 1726855276.87280: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 14:01:16 -0400 (0:00:00.552) 0:00:13.223 ****** 30582 1726855276.87314: entering _queue_task() for managed_node3/debug 30582 1726855276.87838: worker is 1 (out of 1 available) 30582 1726855276.87851: exiting _queue_task() for managed_node3/debug 30582 1726855276.87863: done queuing things up, now waiting for results queue to drain 30582 1726855276.87864: waiting for pending results... 
30582 1726855276.88514: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855276.88557: in run() - task 0affcc66-ac2b-aa83-7d57-00000000018f 30582 1726855276.88576: variable 'ansible_search_path' from source: unknown 30582 1726855276.88593: variable 'ansible_search_path' from source: unknown 30582 1726855276.88632: calling self._execute() 30582 1726855276.88826: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.88830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.88832: variable 'omit' from source: magic vars 30582 1726855276.89125: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.89136: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.89142: variable 'omit' from source: magic vars 30582 1726855276.89179: variable 'omit' from source: magic vars 30582 1726855276.89206: variable 'omit' from source: magic vars 30582 1726855276.89235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855276.89261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855276.89282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855276.89301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.89310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855276.89334: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855276.89337: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.89339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.89420: Set 
connection var ansible_timeout to 10 30582 1726855276.89423: Set connection var ansible_connection to ssh 30582 1726855276.89428: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855276.89433: Set connection var ansible_pipelining to False 30582 1726855276.89438: Set connection var ansible_shell_executable to /bin/sh 30582 1726855276.89440: Set connection var ansible_shell_type to sh 30582 1726855276.89456: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.89459: variable 'ansible_connection' from source: unknown 30582 1726855276.89462: variable 'ansible_module_compression' from source: unknown 30582 1726855276.89464: variable 'ansible_shell_type' from source: unknown 30582 1726855276.89466: variable 'ansible_shell_executable' from source: unknown 30582 1726855276.89468: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.89475: variable 'ansible_pipelining' from source: unknown 30582 1726855276.89477: variable 'ansible_timeout' from source: unknown 30582 1726855276.89479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.89578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855276.89586: variable 'omit' from source: magic vars 30582 1726855276.89597: starting attempt loop 30582 1726855276.89603: running the handler 30582 1726855276.89642: variable '__network_connections_result' from source: set_fact 30582 1726855276.89697: variable '__network_connections_result' from source: set_fact 30582 1726855276.89773: handler run complete 30582 1726855276.89791: attempt loop complete, returning result 30582 1726855276.89795: _execute() done 30582 1726855276.89797: dumping result to json 30582 
1726855276.89802: done dumping result, returning 30582 1726855276.89815: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-00000000018f] 30582 1726855276.89818: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000018f 30582 1726855276.89901: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000018f 30582 1726855276.89904: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 9fc70a3d-08d2-4d99-b645-a6e60c4199d8" ] } } 30582 1726855276.90007: no more pending results, returning what we have 30582 1726855276.90010: results queue empty 30582 1726855276.90012: checking for any_errors_fatal 30582 1726855276.90013: done checking for any_errors_fatal 30582 1726855276.90014: checking for max_fail_percentage 30582 1726855276.90015: done checking for max_fail_percentage 30582 1726855276.90016: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.90017: done checking to see if all hosts have failed 30582 1726855276.90018: getting the remaining hosts for this loop 30582 1726855276.90019: done getting the remaining hosts for this loop 30582 1726855276.90023: getting the next task for host managed_node3 30582 1726855276.90037: done getting next task for host managed_node3 30582 1726855276.90040: ^ task is: TASK: Asserts 30582 1726855276.90042: ^ state is: HOST STATE: 
block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855276.90047: getting variables 30582 1726855276.90048: in VariableManager get_vars() 30582 1726855276.90072: Calling all_inventory to load vars for managed_node3 30582 1726855276.90074: Calling groups_inventory to load vars for managed_node3 30582 1726855276.90077: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.90089: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.90092: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.90095: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.90854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.92476: done with get_vars() 30582 1726855276.92495: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:01:16 -0400 (0:00:00.052) 0:00:13.275 ****** 30582 1726855276.92564: entering _queue_task() for managed_node3/include_tasks 30582 1726855276.92809: worker is 1 (out of 1 available) 30582 1726855276.92822: exiting _queue_task() for managed_node3/include_tasks 30582 1726855276.92834: done queuing things up, now waiting for results queue to drain 30582 1726855276.92836: waiting for pending 
results... 30582 1726855276.93011: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855276.93076: in run() - task 0affcc66-ac2b-aa83-7d57-000000000096 30582 1726855276.93090: variable 'ansible_search_path' from source: unknown 30582 1726855276.93093: variable 'ansible_search_path' from source: unknown 30582 1726855276.93127: variable 'lsr_assert' from source: include params 30582 1726855276.93284: variable 'lsr_assert' from source: include params 30582 1726855276.93334: variable 'omit' from source: magic vars 30582 1726855276.93423: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855276.93430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855276.93438: variable 'omit' from source: magic vars 30582 1726855276.93599: variable 'ansible_distribution_major_version' from source: facts 30582 1726855276.93607: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855276.93617: variable 'item' from source: unknown 30582 1726855276.93658: variable 'item' from source: unknown 30582 1726855276.93680: variable 'item' from source: unknown 30582 1726855276.93728: variable 'item' from source: unknown 30582 1726855276.93859: dumping result to json 30582 1726855276.93862: done dumping result, returning 30582 1726855276.93864: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-000000000096] 30582 1726855276.93866: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000096 30582 1726855276.93907: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000096 30582 1726855276.93910: WORKER PROCESS EXITING 30582 1726855276.93938: no more pending results, returning what we have 30582 1726855276.93943: in VariableManager get_vars() 30582 1726855276.93974: Calling all_inventory to load vars for managed_node3 30582 1726855276.93977: Calling groups_inventory to load vars for managed_node3 30582 1726855276.93979: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855276.93994: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.93997: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.94000: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.95094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.96418: done with get_vars() 30582 1726855276.96432: variable 'ansible_search_path' from source: unknown 30582 1726855276.96433: variable 'ansible_search_path' from source: unknown 30582 1726855276.96462: we have included files to process 30582 1726855276.96463: generating all_blocks data 30582 1726855276.96465: done generating all_blocks data 30582 1726855276.96468: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855276.96469: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855276.96471: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855276.96606: in VariableManager get_vars() 30582 1726855276.96619: done with get_vars() 30582 1726855276.96791: done processing included file 30582 1726855276.96793: iterating over new_blocks loaded from include file 30582 1726855276.96794: in VariableManager get_vars() 30582 1726855276.96804: done with get_vars() 30582 1726855276.96805: filtering new block on tags 30582 1726855276.96838: done filtering new block on tags 30582 1726855276.96839: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => 
(item=tasks/assert_profile_present.yml) 30582 1726855276.96843: extending task lists for all hosts with included blocks 30582 1726855276.97454: done extending task lists 30582 1726855276.97455: done processing included files 30582 1726855276.97456: results queue empty 30582 1726855276.97456: checking for any_errors_fatal 30582 1726855276.97460: done checking for any_errors_fatal 30582 1726855276.97460: checking for max_fail_percentage 30582 1726855276.97461: done checking for max_fail_percentage 30582 1726855276.97461: checking to see if all hosts have failed and the running result is not ok 30582 1726855276.97462: done checking to see if all hosts have failed 30582 1726855276.97462: getting the remaining hosts for this loop 30582 1726855276.97463: done getting the remaining hosts for this loop 30582 1726855276.97465: getting the next task for host managed_node3 30582 1726855276.97468: done getting next task for host managed_node3 30582 1726855276.97469: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855276.97471: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855276.97473: getting variables 30582 1726855276.97474: in VariableManager get_vars() 30582 1726855276.97481: Calling all_inventory to load vars for managed_node3 30582 1726855276.97482: Calling groups_inventory to load vars for managed_node3 30582 1726855276.97484: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855276.97492: Calling all_plugins_play to load vars for managed_node3 30582 1726855276.97493: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855276.97495: Calling groups_plugins_play to load vars for managed_node3 30582 1726855276.98332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855276.99448: done with get_vars() 30582 1726855276.99463: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 14:01:16 -0400 (0:00:00.069) 0:00:13.345 ****** 30582 1726855276.99526: entering _queue_task() for managed_node3/include_tasks 30582 1726855276.99779: worker is 1 (out of 1 available) 30582 1726855276.99797: exiting _queue_task() for managed_node3/include_tasks 30582 1726855276.99809: done queuing things up, now waiting for results queue to drain 30582 1726855276.99811: waiting for pending results... 
30582 1726855276.99995: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855277.00083: in run() - task 0affcc66-ac2b-aa83-7d57-000000000383 30582 1726855277.00100: variable 'ansible_search_path' from source: unknown 30582 1726855277.00103: variable 'ansible_search_path' from source: unknown 30582 1726855277.00133: calling self._execute() 30582 1726855277.00196: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.00200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.00208: variable 'omit' from source: magic vars 30582 1726855277.00476: variable 'ansible_distribution_major_version' from source: facts 30582 1726855277.00490: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855277.00496: _execute() done 30582 1726855277.00499: dumping result to json 30582 1726855277.00503: done dumping result, returning 30582 1726855277.00509: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-000000000383] 30582 1726855277.00514: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000383 30582 1726855277.00599: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000383 30582 1726855277.00602: WORKER PROCESS EXITING 30582 1726855277.00629: no more pending results, returning what we have 30582 1726855277.00634: in VariableManager get_vars() 30582 1726855277.00668: Calling all_inventory to load vars for managed_node3 30582 1726855277.00670: Calling groups_inventory to load vars for managed_node3 30582 1726855277.00674: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855277.00689: Calling all_plugins_play to load vars for managed_node3 30582 1726855277.00692: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855277.00695: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855277.01470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855277.02318: done with get_vars() 30582 1726855277.02331: variable 'ansible_search_path' from source: unknown 30582 1726855277.02332: variable 'ansible_search_path' from source: unknown 30582 1726855277.02338: variable 'item' from source: include params 30582 1726855277.02414: variable 'item' from source: include params 30582 1726855277.02438: we have included files to process 30582 1726855277.02439: generating all_blocks data 30582 1726855277.02440: done generating all_blocks data 30582 1726855277.02441: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855277.02442: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855277.02443: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855277.03118: done processing included file 30582 1726855277.03120: iterating over new_blocks loaded from include file 30582 1726855277.03121: in VariableManager get_vars() 30582 1726855277.03131: done with get_vars() 30582 1726855277.03132: filtering new block on tags 30582 1726855277.03214: done filtering new block on tags 30582 1726855277.03217: in VariableManager get_vars() 30582 1726855277.03225: done with get_vars() 30582 1726855277.03226: filtering new block on tags 30582 1726855277.03257: done filtering new block on tags 30582 1726855277.03258: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855277.03262: extending task lists for all hosts with included blocks 30582 1726855277.03415: done 
extending task lists 30582 1726855277.03416: done processing included files 30582 1726855277.03417: results queue empty 30582 1726855277.03417: checking for any_errors_fatal 30582 1726855277.03420: done checking for any_errors_fatal 30582 1726855277.03420: checking for max_fail_percentage 30582 1726855277.03421: done checking for max_fail_percentage 30582 1726855277.03421: checking to see if all hosts have failed and the running result is not ok 30582 1726855277.03422: done checking to see if all hosts have failed 30582 1726855277.03422: getting the remaining hosts for this loop 30582 1726855277.03423: done getting the remaining hosts for this loop 30582 1726855277.03425: getting the next task for host managed_node3 30582 1726855277.03428: done getting next task for host managed_node3 30582 1726855277.03430: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855277.03432: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855277.03433: getting variables 30582 1726855277.03434: in VariableManager get_vars() 30582 1726855277.03440: Calling all_inventory to load vars for managed_node3 30582 1726855277.03441: Calling groups_inventory to load vars for managed_node3 30582 1726855277.03442: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855277.03446: Calling all_plugins_play to load vars for managed_node3 30582 1726855277.03448: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855277.03449: Calling groups_plugins_play to load vars for managed_node3 30582 1726855277.07268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855277.08506: done with get_vars() 30582 1726855277.08523: done getting variables 30582 1726855277.08553: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:01:17 -0400 (0:00:00.090) 0:00:13.435 ****** 30582 1726855277.08572: entering _queue_task() for managed_node3/set_fact 30582 1726855277.08841: worker is 1 (out of 1 available) 30582 1726855277.08853: exiting _queue_task() for managed_node3/set_fact 30582 1726855277.08864: done queuing things up, now waiting for results queue to drain 30582 1726855277.08866: waiting for pending results... 
30582 1726855277.09047: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855277.09140: in run() - task 0affcc66-ac2b-aa83-7d57-0000000003fe 30582 1726855277.09152: variable 'ansible_search_path' from source: unknown 30582 1726855277.09155: variable 'ansible_search_path' from source: unknown 30582 1726855277.09183: calling self._execute() 30582 1726855277.09249: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.09253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.09263: variable 'omit' from source: magic vars 30582 1726855277.09533: variable 'ansible_distribution_major_version' from source: facts 30582 1726855277.09542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855277.09548: variable 'omit' from source: magic vars 30582 1726855277.09584: variable 'omit' from source: magic vars 30582 1726855277.09609: variable 'omit' from source: magic vars 30582 1726855277.09649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855277.09669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855277.09684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855277.09702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.09712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.09735: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855277.09738: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.09743: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855277.09813: Set connection var ansible_timeout to 10 30582 1726855277.09817: Set connection var ansible_connection to ssh 30582 1726855277.09822: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855277.09827: Set connection var ansible_pipelining to False 30582 1726855277.09832: Set connection var ansible_shell_executable to /bin/sh 30582 1726855277.09834: Set connection var ansible_shell_type to sh 30582 1726855277.09851: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.09855: variable 'ansible_connection' from source: unknown 30582 1726855277.09858: variable 'ansible_module_compression' from source: unknown 30582 1726855277.09861: variable 'ansible_shell_type' from source: unknown 30582 1726855277.09863: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.09866: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.09868: variable 'ansible_pipelining' from source: unknown 30582 1726855277.09870: variable 'ansible_timeout' from source: unknown 30582 1726855277.09874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.10004: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855277.10008: variable 'omit' from source: magic vars 30582 1726855277.10010: starting attempt loop 30582 1726855277.10013: running the handler 30582 1726855277.10192: handler run complete 30582 1726855277.10196: attempt loop complete, returning result 30582 1726855277.10198: _execute() done 30582 1726855277.10200: dumping result to json 30582 1726855277.10202: done dumping result, returning 30582 1726855277.10205: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-0000000003fe] 30582 1726855277.10206: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000003fe 30582 1726855277.10269: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000003fe 30582 1726855277.10273: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855277.10525: no more pending results, returning what we have 30582 1726855277.10528: results queue empty 30582 1726855277.10528: checking for any_errors_fatal 30582 1726855277.10530: done checking for any_errors_fatal 30582 1726855277.10530: checking for max_fail_percentage 30582 1726855277.10532: done checking for max_fail_percentage 30582 1726855277.10532: checking to see if all hosts have failed and the running result is not ok 30582 1726855277.10533: done checking to see if all hosts have failed 30582 1726855277.10534: getting the remaining hosts for this loop 30582 1726855277.10535: done getting the remaining hosts for this loop 30582 1726855277.10538: getting the next task for host managed_node3 30582 1726855277.10545: done getting next task for host managed_node3 30582 1726855277.10547: ^ task is: TASK: Stat profile file 30582 1726855277.10552: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855277.10557: getting variables 30582 1726855277.10558: in VariableManager get_vars() 30582 1726855277.10584: Calling all_inventory to load vars for managed_node3 30582 1726855277.10678: Calling groups_inventory to load vars for managed_node3 30582 1726855277.10682: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855277.10697: Calling all_plugins_play to load vars for managed_node3 30582 1726855277.10700: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855277.10704: Calling groups_plugins_play to load vars for managed_node3 30582 1726855277.11992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855277.13719: done with get_vars() 30582 1726855277.13744: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 14:01:17 -0400 (0:00:00.052) 0:00:13.488 ****** 30582 1726855277.13847: entering _queue_task() for managed_node3/stat 30582 1726855277.14250: worker is 1 (out of 1 available) 30582 1726855277.14263: exiting _queue_task() for managed_node3/stat 30582 1726855277.14276: done queuing things up, now waiting for results queue to drain 30582 1726855277.14278: 
waiting for pending results... 30582 1726855277.14533: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855277.14777: in run() - task 0affcc66-ac2b-aa83-7d57-0000000003ff 30582 1726855277.14781: variable 'ansible_search_path' from source: unknown 30582 1726855277.14784: variable 'ansible_search_path' from source: unknown 30582 1726855277.14793: calling self._execute() 30582 1726855277.14849: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.14860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.14875: variable 'omit' from source: magic vars 30582 1726855277.15310: variable 'ansible_distribution_major_version' from source: facts 30582 1726855277.15328: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855277.15389: variable 'omit' from source: magic vars 30582 1726855277.15408: variable 'omit' from source: magic vars 30582 1726855277.15516: variable 'profile' from source: play vars 30582 1726855277.15527: variable 'interface' from source: play vars 30582 1726855277.15794: variable 'interface' from source: play vars 30582 1726855277.15797: variable 'omit' from source: magic vars 30582 1726855277.16099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855277.16103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855277.16105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855277.16107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.16109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.16112: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855277.16113: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.16115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.16267: Set connection var ansible_timeout to 10 30582 1726855277.16344: Set connection var ansible_connection to ssh 30582 1726855277.16358: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855277.16367: Set connection var ansible_pipelining to False 30582 1726855277.16375: Set connection var ansible_shell_executable to /bin/sh 30582 1726855277.16381: Set connection var ansible_shell_type to sh 30582 1726855277.16415: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.16452: variable 'ansible_connection' from source: unknown 30582 1726855277.16459: variable 'ansible_module_compression' from source: unknown 30582 1726855277.16466: variable 'ansible_shell_type' from source: unknown 30582 1726855277.16592: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.16595: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.16598: variable 'ansible_pipelining' from source: unknown 30582 1726855277.16600: variable 'ansible_timeout' from source: unknown 30582 1726855277.16602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.17023: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855277.17039: variable 'omit' from source: magic vars 30582 1726855277.17049: starting attempt loop 30582 1726855277.17059: running the handler 30582 1726855277.17078: _low_level_execute_command(): starting 30582 1726855277.17095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 
1726855277.17926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855277.17946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855277.17961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.17992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855277.18094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.18118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.18216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.20001: stdout chunk (state=3): >>>/root <<< 30582 1726855277.20493: stdout chunk (state=3): >>><<< 30582 1726855277.20497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.20500: stderr chunk (state=3): >>><<< 30582 1726855277.20504: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.20507: _low_level_execute_command(): starting 30582 1726855277.20510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299 `" && echo ansible-tmp-1726855277.2040133-31184-43861946950299="` echo /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299 `" ) && sleep 0' 30582 1726855277.21538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855277.21554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.21574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.21922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.22014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.22017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.23927: stdout chunk (state=3): >>>ansible-tmp-1726855277.2040133-31184-43861946950299=/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299 <<< 30582 1726855277.24036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.24073: stderr chunk (state=3): >>><<< 30582 1726855277.24083: stdout chunk (state=3): >>><<< 30582 1726855277.24311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855277.2040133-31184-43861946950299=/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.24315: variable 'ansible_module_compression' from source: unknown 30582 1726855277.24419: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855277.24464: variable 'ansible_facts' from source: unknown 30582 1726855277.24658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py 30582 1726855277.25145: Sending initial data 30582 1726855277.25148: Sent initial data (152 bytes) 30582 1726855277.26123: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855277.26193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.26407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.26420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.26520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.28313: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855277.28327: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855277.28477: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855277.28527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855277.28594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpz4ih2y7o /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py <<< 30582 1726855277.28597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py" <<< 30582 1726855277.28646: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpz4ih2y7o" to remote "/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py" <<< 30582 1726855277.30114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.30172: stderr chunk (state=3): >>><<< 30582 1726855277.30183: stdout chunk (state=3): >>><<< 30582 1726855277.30250: done transferring module to remote 30582 1726855277.30306: _low_level_execute_command(): starting 30582 1726855277.30316: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/ /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py && sleep 0' 30582 1726855277.31415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.31419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855277.31421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.31423: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.31425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.31484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.31510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.31514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.31572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.33373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.33436: stderr chunk (state=3): >>><<< 30582 1726855277.33445: stdout chunk (state=3): >>><<< 30582 1726855277.33471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.33482: _low_level_execute_command(): starting 30582 1726855277.33494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/AnsiballZ_stat.py && sleep 0' 30582 1726855277.34340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855277.34356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855277.34372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.34482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.34516: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.34626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.49613: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855277.51021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855277.51025: stdout chunk (state=3): >>><<< 30582 1726855277.51028: stderr chunk (state=3): >>><<< 30582 1726855277.51166: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855277.51170: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855277.51173: _low_level_execute_command(): starting 30582 1726855277.51175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855277.2040133-31184-43861946950299/ > /dev/null 2>&1 && sleep 0' 30582 1726855277.51692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.51725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.51754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.51832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.53678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.53893: stderr chunk (state=3): >>><<< 30582 1726855277.53896: stdout chunk (state=3): >>><<< 30582 1726855277.53898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.53900: handler run complete 30582 1726855277.53901: attempt loop complete, returning result 30582 1726855277.53903: _execute() done 30582 1726855277.53904: dumping result to json 30582 1726855277.53906: done dumping result, returning 30582 1726855277.53908: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-0000000003ff] 30582 1726855277.53909: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000003ff 30582 1726855277.53970: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000003ff 30582 1726855277.53973: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855277.54032: no more pending results, returning what we have 30582 1726855277.54036: results queue empty 30582 1726855277.54037: checking for any_errors_fatal 30582 1726855277.54042: done checking for any_errors_fatal 30582 1726855277.54042: checking for max_fail_percentage 30582 1726855277.54044: done checking for max_fail_percentage 30582 1726855277.54045: checking to see if all hosts have failed and the running result is not ok 30582 1726855277.54045: done checking to see if all hosts have failed 30582 1726855277.54046: getting the remaining hosts for this loop 30582 1726855277.54047: done getting the remaining hosts for this loop 30582 1726855277.54051: getting the next task for host managed_node3 30582 1726855277.54057: done getting next task for host managed_node3 30582 1726855277.54060: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855277.54065: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855277.54068: getting variables 30582 1726855277.54069: in VariableManager get_vars() 30582 1726855277.54104: Calling all_inventory to load vars for managed_node3 30582 1726855277.54107: Calling groups_inventory to load vars for managed_node3 30582 1726855277.54109: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855277.54119: Calling all_plugins_play to load vars for managed_node3 30582 1726855277.54121: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855277.54124: Calling groups_plugins_play to load vars for managed_node3 30582 1726855277.55353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855277.56764: done with get_vars() 30582 1726855277.56790: done getting variables 30582 1726855277.56835: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:01:17 -0400 (0:00:00.430) 0:00:13.918 ****** 30582 1726855277.56861: entering _queue_task() for managed_node3/set_fact 30582 1726855277.57123: worker is 1 (out of 1 available) 30582 1726855277.57136: exiting _queue_task() for managed_node3/set_fact 30582 1726855277.57148: done queuing things up, now waiting for results queue to drain 30582 1726855277.57149: waiting for pending results... 30582 1726855277.57339: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855277.57422: in run() - task 0affcc66-ac2b-aa83-7d57-000000000400 30582 1726855277.57432: variable 'ansible_search_path' from source: unknown 30582 1726855277.57435: variable 'ansible_search_path' from source: unknown 30582 1726855277.57463: calling self._execute() 30582 1726855277.57538: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.57542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.57550: variable 'omit' from source: magic vars 30582 1726855277.57844: variable 'ansible_distribution_major_version' from source: facts 30582 1726855277.57853: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855277.57940: variable 'profile_stat' from source: set_fact 30582 1726855277.57948: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855277.57951: when evaluation is False, skipping this task 30582 1726855277.57954: _execute() done 30582 1726855277.57957: dumping result to json 30582 1726855277.57959: done dumping 
result, returning 30582 1726855277.57967: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-000000000400] 30582 1726855277.57971: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000400 30582 1726855277.58055: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000400 30582 1726855277.58058: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855277.58105: no more pending results, returning what we have 30582 1726855277.58109: results queue empty 30582 1726855277.58110: checking for any_errors_fatal 30582 1726855277.58120: done checking for any_errors_fatal 30582 1726855277.58121: checking for max_fail_percentage 30582 1726855277.58123: done checking for max_fail_percentage 30582 1726855277.58124: checking to see if all hosts have failed and the running result is not ok 30582 1726855277.58125: done checking to see if all hosts have failed 30582 1726855277.58125: getting the remaining hosts for this loop 30582 1726855277.58127: done getting the remaining hosts for this loop 30582 1726855277.58130: getting the next task for host managed_node3 30582 1726855277.58140: done getting next task for host managed_node3 30582 1726855277.58142: ^ task is: TASK: Get NM profile info 30582 1726855277.58147: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855277.58152: getting variables 30582 1726855277.58153: in VariableManager get_vars() 30582 1726855277.58193: Calling all_inventory to load vars for managed_node3 30582 1726855277.58197: Calling groups_inventory to load vars for managed_node3 30582 1726855277.58200: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855277.58219: Calling all_plugins_play to load vars for managed_node3 30582 1726855277.58222: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855277.58225: Calling groups_plugins_play to load vars for managed_node3 30582 1726855277.59617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855277.60460: done with get_vars() 30582 1726855277.60476: done getting variables 30582 1726855277.60545: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:01:17 -0400 (0:00:00.037) 
0:00:13.955 ****** 30582 1726855277.60566: entering _queue_task() for managed_node3/shell 30582 1726855277.60568: Creating lock for shell 30582 1726855277.60823: worker is 1 (out of 1 available) 30582 1726855277.60838: exiting _queue_task() for managed_node3/shell 30582 1726855277.60848: done queuing things up, now waiting for results queue to drain 30582 1726855277.60849: waiting for pending results... 30582 1726855277.61031: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855277.61112: in run() - task 0affcc66-ac2b-aa83-7d57-000000000401 30582 1726855277.61123: variable 'ansible_search_path' from source: unknown 30582 1726855277.61126: variable 'ansible_search_path' from source: unknown 30582 1726855277.61154: calling self._execute() 30582 1726855277.61223: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.61227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.61236: variable 'omit' from source: magic vars 30582 1726855277.61518: variable 'ansible_distribution_major_version' from source: facts 30582 1726855277.61522: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855277.61528: variable 'omit' from source: magic vars 30582 1726855277.61563: variable 'omit' from source: magic vars 30582 1726855277.61634: variable 'profile' from source: play vars 30582 1726855277.61637: variable 'interface' from source: play vars 30582 1726855277.61691: variable 'interface' from source: play vars 30582 1726855277.61706: variable 'omit' from source: magic vars 30582 1726855277.61741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855277.61767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855277.61783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855277.61800: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.61809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855277.61834: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855277.61837: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.61840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.61915: Set connection var ansible_timeout to 10 30582 1726855277.61918: Set connection var ansible_connection to ssh 30582 1726855277.61923: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855277.61928: Set connection var ansible_pipelining to False 30582 1726855277.61933: Set connection var ansible_shell_executable to /bin/sh 30582 1726855277.61936: Set connection var ansible_shell_type to sh 30582 1726855277.61958: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.61961: variable 'ansible_connection' from source: unknown 30582 1726855277.61964: variable 'ansible_module_compression' from source: unknown 30582 1726855277.61966: variable 'ansible_shell_type' from source: unknown 30582 1726855277.61968: variable 'ansible_shell_executable' from source: unknown 30582 1726855277.61971: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855277.61973: variable 'ansible_pipelining' from source: unknown 30582 1726855277.61976: variable 'ansible_timeout' from source: unknown 30582 1726855277.61978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855277.62078: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855277.62086: variable 'omit' from source: magic vars 30582 1726855277.62096: starting attempt loop 30582 1726855277.62099: running the handler 30582 1726855277.62108: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855277.62123: _low_level_execute_command(): starting 30582 1726855277.62130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855277.62649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.62653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855277.62657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.62701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.62717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.62785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.64476: stdout chunk (state=3): >>>/root <<< 30582 1726855277.64574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.64608: stderr chunk (state=3): >>><<< 30582 1726855277.64611: stdout chunk (state=3): >>><<< 30582 1726855277.64634: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.64651: _low_level_execute_command(): starting 30582 1726855277.64655: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955 `" && echo ansible-tmp-1726855277.6463392-31210-186687986040955="` echo /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955 `" ) && sleep 0' 30582 1726855277.65122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.65133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855277.65136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.65139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.65141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.65182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.65185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.65194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.65256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.67153: stdout chunk (state=3): 
>>>ansible-tmp-1726855277.6463392-31210-186687986040955=/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955 <<< 30582 1726855277.67316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.67320: stdout chunk (state=3): >>><<< 30582 1726855277.67322: stderr chunk (state=3): >>><<< 30582 1726855277.67347: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855277.6463392-31210-186687986040955=/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855277.67462: variable 'ansible_module_compression' from source: unknown 30582 1726855277.67465: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855277.67509: variable 'ansible_facts' 
from source: unknown 30582 1726855277.67608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py 30582 1726855277.67819: Sending initial data 30582 1726855277.67823: Sent initial data (156 bytes) 30582 1726855277.68438: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855277.68498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.68578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855277.68606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.68703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.70275: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855277.70282: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855277.70331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855277.70394: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0aah_2zc /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py <<< 30582 1726855277.70398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py" <<< 30582 1726855277.70453: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0aah_2zc" to remote "/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py" <<< 30582 1726855277.71042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.71082: stderr chunk (state=3): >>><<< 30582 1726855277.71090: stdout chunk (state=3): >>><<< 30582 1726855277.71113: done transferring module to remote 30582 1726855277.71122: _low_level_execute_command(): starting 30582 1726855277.71127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/ 
/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py && sleep 0' 30582 1726855277.71566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855277.71569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.71571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855277.71577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855277.71579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855277.71622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855277.71625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855277.71692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855277.73450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855277.73475: stderr chunk (state=3): >>><<< 30582 1726855277.73478: stdout chunk (state=3): >>><<< 30582 1726855277.73501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30582 1726855277.73504: _low_level_execute_command(): starting
30582 1726855277.73507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/AnsiballZ_command.py && sleep 0'
30582 1726855277.73959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
<<<
30582 1726855277.73963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30582 1726855277.73965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
<<<
30582 1726855277.73967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
<<<
30582 1726855277.73969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30582 1726855277.74017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
<<<
30582 1726855277.74021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
<<<
30582 1726855277.74023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
30582 1726855277.74091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
30582 1726855277.91134: stdout chunk (state=3): >>>
{"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:17.892028", "end": "2024-09-20 14:01:17.910235", "delta": "0:00:00.018207", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
30582 1726855277.92731: stderr chunk (state=3): >>>debug2: Received exit status from master 0
Shared connection to 10.31.9.244 closed.
<<<
30582 1726855277.92757: stderr chunk (state=3): >>><<<
30582 1726855277.92761: stdout chunk (state=3): >>><<<
30582 1726855277.92782: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:17.892028", "end": "2024-09-20 14:01:17.910235", "delta": "0:00:00.018207", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.9.244 closed.
30582 1726855277.92815: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30582 1726855277.92822: _low_level_execute_command(): starting
30582 1726855277.92827: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855277.6463392-31210-186687986040955/ > /dev/null 2>&1 && sleep 0'
30582 1726855277.93266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
30582 1726855277.93301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
<<<
30582 1726855277.93304: stderr chunk (state=3): >>>debug2: match not found
<<<
30582 1726855277.93308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30582 1726855277.93310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
30582 1726855277.93313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
<<<
30582 1726855277.93316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
30582 1726855277.93365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
<<<
30582 1726855277.93369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
<<<
30582 1726855277.93371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
30582 1726855277.93436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
30582 1726855277.95264: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
30582 1726855277.95297: stderr chunk (state=3): >>><<<
30582 1726855277.95301: stdout chunk (state=3): >>><<<
30582 1726855277.95314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30582 1726855277.95320: handler run complete
30582 1726855277.95339: Evaluated conditional (False): False
30582 1726855277.95347: attempt loop complete, returning result
30582 1726855277.95350: _execute() done
30582 1726855277.95352: dumping result to json
30582 1726855277.95357: done dumping result, returning
30582 1726855277.95364: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-000000000401]
30582 1726855277.95371: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000401
30582 1726855277.95470: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000401
30582 1726855277.95475: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc",
    "delta": "0:00:00.018207",
    "end": "2024-09-20 14:01:17.910235",
    "rc": 0,
    "start": "2024-09-20 14:01:17.892028"
}

STDOUT:

statebr /etc/NetworkManager/system-connections/statebr.nmconnection 

30582 1726855277.95544: no more pending results, returning what we have
30582 1726855277.95547: results queue empty
30582 1726855277.95548: checking for any_errors_fatal
30582 1726855277.95556: done checking for any_errors_fatal
30582 1726855277.95557: checking for max_fail_percentage
30582 1726855277.95559: done checking for max_fail_percentage
30582 1726855277.95560: checking to see if all hosts have failed and the running result is not ok
30582 1726855277.95561: done checking to see if all hosts have failed
30582 1726855277.95561: getting the remaining hosts for this loop
30582 1726855277.95563: done getting the remaining hosts for this loop
30582 1726855277.95566: getting the next task for host managed_node3
30582 1726855277.95574: done getting next task for host managed_node3
30582 1726855277.95576: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30582 1726855277.95581: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855277.95586: getting variables
30582 1726855277.95596: in VariableManager get_vars()
30582 1726855277.95628: Calling all_inventory to load vars for managed_node3
30582 1726855277.95631: Calling groups_inventory to load vars for managed_node3
30582 1726855277.95634: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855277.95645: Calling all_plugins_play to load vars for managed_node3
30582 1726855277.95648: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855277.95651: Calling groups_plugins_play to load vars for managed_node3
30582 1726855277.96449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855277.97311: done with get_vars()
30582 1726855277.97332: done getting variables
30582 1726855277.97376: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 14:01:17 -0400 (0:00:00.368) 0:00:14.323 ******
30582 1726855277.97405: entering _queue_task() for managed_node3/set_fact
30582 1726855277.97661: worker is 1 (out of 1 available)
30582 1726855277.97674: exiting _queue_task() for managed_node3/set_fact
30582 1726855277.97685: done queuing things up, now waiting for results queue to drain
30582 1726855277.97686: waiting for pending results...
30582 1726855277.97869: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30582 1726855277.97952: in run() - task 0affcc66-ac2b-aa83-7d57-000000000402
30582 1726855277.97963: variable 'ansible_search_path' from source: unknown
30582 1726855277.97966: variable 'ansible_search_path' from source: unknown
30582 1726855277.98000: calling self._execute()
30582 1726855277.98063: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855277.98067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855277.98077: variable 'omit' from source: magic vars
30582 1726855277.98359: variable 'ansible_distribution_major_version' from source: facts
30582 1726855277.98369: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855277.98465: variable 'nm_profile_exists' from source: set_fact
30582 1726855277.98472: Evaluated conditional (nm_profile_exists.rc == 0): True
30582 1726855277.98478: variable 'omit' from source: magic vars
30582 1726855277.98517: variable 'omit' from source: magic vars
30582 1726855277.98538: variable 'omit' from source: magic vars
30582 1726855277.98569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855277.98606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855277.98622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855277.98636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855277.98645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855277.98669: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855277.98673: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855277.98687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855277.98755: Set connection var ansible_timeout to 10
30582 1726855277.98759: Set connection var ansible_connection to ssh
30582 1726855277.98764: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855277.98769: Set connection var ansible_pipelining to False
30582 1726855277.98774: Set connection var ansible_shell_executable to /bin/sh
30582 1726855277.98776: Set connection var ansible_shell_type to sh
30582 1726855277.98798: variable 'ansible_shell_executable' from source: unknown
30582 1726855277.98801: variable 'ansible_connection' from source: unknown
30582 1726855277.98804: variable 'ansible_module_compression' from source: unknown
30582 1726855277.98806: variable 'ansible_shell_type' from source: unknown
30582 1726855277.98808: variable 'ansible_shell_executable' from source: unknown
30582 1726855277.98812: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855277.98815: variable 'ansible_pipelining' from source: unknown
30582 1726855277.98818: variable 'ansible_timeout' from source: unknown
30582 1726855277.98821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855277.98925: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855277.98933: variable 'omit' from source: magic vars
30582 1726855277.98939: starting attempt loop
30582 1726855277.98941: running the handler
30582 1726855277.98952: handler run complete
30582 1726855277.98960: attempt loop complete, returning result
30582 1726855277.98963: _execute() done
30582 1726855277.98965: dumping result to json
30582 1726855277.98968: done dumping result, returning
30582 1726855277.98976: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-000000000402]
30582 1726855277.98980: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000402
30582 1726855277.99061: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000402
30582 1726855277.99064: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
30582 1726855277.99119: no more pending results, returning what we have
30582 1726855277.99122: results queue empty
30582 1726855277.99123: checking for any_errors_fatal
30582 1726855277.99132: done checking for any_errors_fatal
30582 1726855277.99132: checking for max_fail_percentage
30582 1726855277.99135: done checking for max_fail_percentage
30582 1726855277.99135: checking to see if all hosts have failed and the running result is not ok
30582 1726855277.99136: done checking to see if all hosts have failed
30582 1726855277.99137: getting the remaining hosts for this loop
30582 1726855277.99138: done getting the remaining hosts for this loop
30582 1726855277.99142: getting the next task for host managed_node3
30582 1726855277.99152: done getting next task for host managed_node3
30582 1726855277.99155: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
30582 1726855277.99160: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855277.99164: getting variables
30582 1726855277.99165: in VariableManager get_vars()
30582 1726855277.99205: Calling all_inventory to load vars for managed_node3
30582 1726855277.99208: Calling groups_inventory to load vars for managed_node3
30582 1726855277.99211: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855277.99220: Calling all_plugins_play to load vars for managed_node3
30582 1726855277.99223: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855277.99226: Calling groups_plugins_play to load vars for managed_node3
30582 1726855278.00126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855278.00981: done with get_vars()
30582 1726855278.01000: done getting variables
30582 1726855278.01043: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30582 1726855278.01133: variable 'profile' from source: play vars
30582 1726855278.01136: variable 'interface' from source: play vars
30582 1726855278.01183: variable 'interface' from source: play vars

TASK [Get the ansible_managed comment in ifcfg-statebr] ************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 14:01:18 -0400 (0:00:00.038) 0:00:14.362 ******
30582 1726855278.01210: entering _queue_task() for managed_node3/command
30582 1726855278.01445: worker is 1 (out of 1 available)
30582 1726855278.01459: exiting _queue_task() for managed_node3/command
30582 1726855278.01469: done queuing things up, now waiting for results queue to drain
30582 1726855278.01470: waiting for pending results...
30582 1726855278.01643: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr
30582 1726855278.01729: in run() - task 0affcc66-ac2b-aa83-7d57-000000000404
30582 1726855278.01740: variable 'ansible_search_path' from source: unknown
30582 1726855278.01744: variable 'ansible_search_path' from source: unknown
30582 1726855278.01771: calling self._execute()
30582 1726855278.01843: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855278.01846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855278.01856: variable 'omit' from source: magic vars
30582 1726855278.02125: variable 'ansible_distribution_major_version' from source: facts
30582 1726855278.02140: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855278.02219: variable 'profile_stat' from source: set_fact
30582 1726855278.02228: Evaluated conditional (profile_stat.stat.exists): False
30582 1726855278.02231: when evaluation is False, skipping this task
30582 1726855278.02233: _execute() done
30582 1726855278.02236: dumping result to json
30582 1726855278.02245: done dumping result, returning
30582 1726855278.02248: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000404]
30582 1726855278.02254: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000404
30582 1726855278.02336: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000404
30582 1726855278.02338: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30582 1726855278.02404: no more pending results, returning what we have
30582 1726855278.02408: results queue empty
30582 1726855278.02409: checking for any_errors_fatal
30582 1726855278.02417: done checking for any_errors_fatal
30582 1726855278.02417: checking for max_fail_percentage
30582 1726855278.02419: done checking for max_fail_percentage
30582 1726855278.02420: checking to see if all hosts have failed and the running result is not ok
30582 1726855278.02420: done checking to see if all hosts have failed
30582 1726855278.02421: getting the remaining hosts for this loop
30582 1726855278.02422: done getting the remaining hosts for this loop
30582 1726855278.02426: getting the next task for host managed_node3
30582 1726855278.02433: done getting next task for host managed_node3
30582 1726855278.02435: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
30582 1726855278.02440: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855278.02443: getting variables
30582 1726855278.02445: in VariableManager get_vars()
30582 1726855278.02472: Calling all_inventory to load vars for managed_node3
30582 1726855278.02475: Calling groups_inventory to load vars for managed_node3
30582 1726855278.02477: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855278.02491: Calling all_plugins_play to load vars for managed_node3
30582 1726855278.02494: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855278.02503: Calling groups_plugins_play to load vars for managed_node3
30582 1726855278.03280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855278.04259: done with get_vars()
30582 1726855278.04276: done getting variables
30582 1726855278.04325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30582 1726855278.04412: variable 'profile' from source: play vars
30582 1726855278.04416: variable 'interface' from source: play vars
30582 1726855278.04457: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-statebr] *********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 14:01:18 -0400 (0:00:00.032) 0:00:14.394 ******
30582 1726855278.04481: entering _queue_task() for managed_node3/set_fact
30582 1726855278.04739: worker is 1 (out of 1 available)
30582 1726855278.04751: exiting _queue_task() for managed_node3/set_fact
30582 1726855278.04762: done queuing things up, now waiting for results queue to drain
30582 1726855278.04763: waiting for pending results...
30582 1726855278.04946: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr
30582 1726855278.05028: in run() - task 0affcc66-ac2b-aa83-7d57-000000000405
30582 1726855278.05038: variable 'ansible_search_path' from source: unknown
30582 1726855278.05041: variable 'ansible_search_path' from source: unknown
30582 1726855278.05069: calling self._execute()
30582 1726855278.05140: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855278.05143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855278.05152: variable 'omit' from source: magic vars
30582 1726855278.05424: variable 'ansible_distribution_major_version' from source: facts
30582 1726855278.05437: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855278.05518: variable 'profile_stat' from source: set_fact
30582 1726855278.05526: Evaluated conditional (profile_stat.stat.exists): False
30582 1726855278.05529: when evaluation is False, skipping this task
30582 1726855278.05532: _execute() done
30582 1726855278.05536: dumping result to json
30582 1726855278.05540: done dumping result, returning
30582 1726855278.05551: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000405]
30582 1726855278.05554: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000405
30582 1726855278.05637: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000405
30582 1726855278.05639: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30582 1726855278.05701: no more pending results, returning what we have
30582 1726855278.05705: results queue empty
30582 1726855278.05706: checking for any_errors_fatal
30582 1726855278.05712: done checking for any_errors_fatal
30582 1726855278.05713: checking for max_fail_percentage
30582 1726855278.05715: done checking for max_fail_percentage
30582 1726855278.05716: checking to see if all hosts have failed and the running result is not ok
30582 1726855278.05716: done checking to see if all hosts have failed
30582 1726855278.05717: getting the remaining hosts for this loop
30582 1726855278.05718: done getting the remaining hosts for this loop
30582 1726855278.05722: getting the next task for host managed_node3
30582 1726855278.05730: done getting next task for host managed_node3
30582 1726855278.05732: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
30582 1726855278.05738: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855278.05742: getting variables
30582 1726855278.05743: in VariableManager get_vars()
30582 1726855278.05776: Calling all_inventory to load vars for managed_node3
30582 1726855278.05779: Calling groups_inventory to load vars for managed_node3
30582 1726855278.05782: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855278.05797: Calling all_plugins_play to load vars for managed_node3
30582 1726855278.05799: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855278.05801: Calling groups_plugins_play to load vars for managed_node3
30582 1726855278.06591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855278.07456: done with get_vars()
30582 1726855278.07474: done getting variables
30582 1726855278.07520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30582 1726855278.07601: variable 'profile' from source: play vars
30582 1726855278.07604: variable 'interface' from source: play vars
30582 1726855278.07646: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 14:01:18 -0400 (0:00:00.031) 0:00:14.426 ******
30582 1726855278.07670: entering _queue_task() for managed_node3/command
30582 1726855278.07909: worker is 1 (out of 1 available)
30582 1726855278.07922: exiting _queue_task() for managed_node3/command
30582 1726855278.07933: done queuing things up, now waiting for results queue to drain
30582 1726855278.07934: waiting for pending results...
30582 1726855278.08116: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr
30582 1726855278.08200: in run() - task 0affcc66-ac2b-aa83-7d57-000000000406
30582 1726855278.08210: variable 'ansible_search_path' from source: unknown
30582 1726855278.08213: variable 'ansible_search_path' from source: unknown
30582 1726855278.08243: calling self._execute()
30582 1726855278.08312: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855278.08317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855278.08325: variable 'omit' from source: magic vars
30582 1726855278.08584: variable 'ansible_distribution_major_version' from source: facts
30582 1726855278.08599: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855278.08677: variable 'profile_stat' from source: set_fact
30582 1726855278.08686: Evaluated conditional (profile_stat.stat.exists): False
30582 1726855278.08692: when evaluation is False, skipping this task
30582 1726855278.08696: _execute() done
30582 1726855278.08699: dumping result to json
30582 1726855278.08702: done dumping result, returning
30582 1726855278.08711: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000406]
30582 1726855278.08714: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000406
30582 1726855278.08794: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000406
30582 1726855278.08796: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855278.08857: no more pending results, returning what we have 30582 1726855278.08861: results queue empty 30582 1726855278.08862: checking for any_errors_fatal 30582 1726855278.08870: done checking for any_errors_fatal 30582 1726855278.08871: checking for max_fail_percentage 30582 1726855278.08873: done checking for max_fail_percentage 30582 1726855278.08874: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.08874: done checking to see if all hosts have failed 30582 1726855278.08875: getting the remaining hosts for this loop 30582 1726855278.08876: done getting the remaining hosts for this loop 30582 1726855278.08880: getting the next task for host managed_node3 30582 1726855278.08890: done getting next task for host managed_node3 30582 1726855278.08892: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855278.08896: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855278.08900: getting variables 30582 1726855278.08901: in VariableManager get_vars() 30582 1726855278.08927: Calling all_inventory to load vars for managed_node3 30582 1726855278.08929: Calling groups_inventory to load vars for managed_node3 30582 1726855278.08932: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.08942: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.08944: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.08947: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.09899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.10749: done with get_vars() 30582 1726855278.10764: done getting variables 30582 1726855278.10810: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855278.10892: variable 'profile' from source: play vars 30582 1726855278.10895: variable 'interface' from source: play vars 30582 1726855278.10934: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:01:18 -0400 (0:00:00.032) 0:00:14.459 ****** 30582 1726855278.10959: entering _queue_task() for managed_node3/set_fact 30582 1726855278.11382: worker is 1 (out of 1 available) 30582 1726855278.11397: exiting _queue_task() for managed_node3/set_fact 30582 
1726855278.11406: done queuing things up, now waiting for results queue to drain 30582 1726855278.11408: waiting for pending results... 30582 1726855278.11695: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855278.11766: in run() - task 0affcc66-ac2b-aa83-7d57-000000000407 30582 1726855278.11802: variable 'ansible_search_path' from source: unknown 30582 1726855278.11810: variable 'ansible_search_path' from source: unknown 30582 1726855278.11854: calling self._execute() 30582 1726855278.11954: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.11965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.11979: variable 'omit' from source: magic vars 30582 1726855278.12374: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.12399: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.12542: variable 'profile_stat' from source: set_fact 30582 1726855278.12546: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855278.12555: when evaluation is False, skipping this task 30582 1726855278.12558: _execute() done 30582 1726855278.12561: dumping result to json 30582 1726855278.12563: done dumping result, returning 30582 1726855278.12570: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000407] 30582 1726855278.12574: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000407 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855278.12713: no more pending results, returning what we have 30582 1726855278.12717: results queue empty 30582 1726855278.12718: checking for any_errors_fatal 30582 1726855278.12724: done checking for any_errors_fatal 30582 1726855278.12725: checking for 
max_fail_percentage 30582 1726855278.12727: done checking for max_fail_percentage 30582 1726855278.12728: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.12728: done checking to see if all hosts have failed 30582 1726855278.12729: getting the remaining hosts for this loop 30582 1726855278.12730: done getting the remaining hosts for this loop 30582 1726855278.12734: getting the next task for host managed_node3 30582 1726855278.12744: done getting next task for host managed_node3 30582 1726855278.12747: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30582 1726855278.12751: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855278.12755: getting variables 30582 1726855278.12757: in VariableManager get_vars() 30582 1726855278.12795: Calling all_inventory to load vars for managed_node3 30582 1726855278.12798: Calling groups_inventory to load vars for managed_node3 30582 1726855278.12801: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.12813: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.12815: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.12817: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.13400: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000407 30582 1726855278.13404: WORKER PROCESS EXITING 30582 1726855278.13618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.14612: done with get_vars() 30582 1726855278.14633: done getting variables 30582 1726855278.14696: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855278.14811: variable 'profile' from source: play vars 30582 1726855278.14815: variable 'interface' from source: play vars 30582 1726855278.14871: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 14:01:18 -0400 (0:00:00.039) 0:00:14.498 ****** 30582 1726855278.14908: entering _queue_task() for managed_node3/assert 30582 1726855278.15226: worker is 1 (out of 1 available) 30582 1726855278.15242: exiting _queue_task() for managed_node3/assert 30582 
1726855278.15254: done queuing things up, now waiting for results queue to drain 30582 1726855278.15256: waiting for pending results... 30582 1726855278.15710: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30582 1726855278.15716: in run() - task 0affcc66-ac2b-aa83-7d57-000000000384 30582 1726855278.15720: variable 'ansible_search_path' from source: unknown 30582 1726855278.15723: variable 'ansible_search_path' from source: unknown 30582 1726855278.15743: calling self._execute() 30582 1726855278.15844: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.15856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.15871: variable 'omit' from source: magic vars 30582 1726855278.16223: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.16240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.16251: variable 'omit' from source: magic vars 30582 1726855278.16313: variable 'omit' from source: magic vars 30582 1726855278.16410: variable 'profile' from source: play vars 30582 1726855278.16422: variable 'interface' from source: play vars 30582 1726855278.16490: variable 'interface' from source: play vars 30582 1726855278.16518: variable 'omit' from source: magic vars 30582 1726855278.16559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855278.16715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855278.16718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855278.16720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.16723: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.16724: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855278.16727: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.16728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.16818: Set connection var ansible_timeout to 10 30582 1726855278.16828: Set connection var ansible_connection to ssh 30582 1726855278.16839: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855278.16848: Set connection var ansible_pipelining to False 30582 1726855278.16857: Set connection var ansible_shell_executable to /bin/sh 30582 1726855278.16863: Set connection var ansible_shell_type to sh 30582 1726855278.16933: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.16936: variable 'ansible_connection' from source: unknown 30582 1726855278.16938: variable 'ansible_module_compression' from source: unknown 30582 1726855278.16940: variable 'ansible_shell_type' from source: unknown 30582 1726855278.16942: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.16944: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.16946: variable 'ansible_pipelining' from source: unknown 30582 1726855278.16948: variable 'ansible_timeout' from source: unknown 30582 1726855278.16950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.17101: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855278.17116: variable 'omit' from source: magic vars 30582 1726855278.17126: starting 
attempt loop 30582 1726855278.17132: running the handler 30582 1726855278.17259: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855278.17261: Evaluated conditional (lsr_net_profile_exists): True 30582 1726855278.17264: handler run complete 30582 1726855278.17281: attempt loop complete, returning result 30582 1726855278.17368: _execute() done 30582 1726855278.17371: dumping result to json 30582 1726855278.17373: done dumping result, returning 30582 1726855278.17375: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000384] 30582 1726855278.17377: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000384 30582 1726855278.17444: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000384 30582 1726855278.17448: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855278.17525: no more pending results, returning what we have 30582 1726855278.17529: results queue empty 30582 1726855278.17530: checking for any_errors_fatal 30582 1726855278.17537: done checking for any_errors_fatal 30582 1726855278.17538: checking for max_fail_percentage 30582 1726855278.17540: done checking for max_fail_percentage 30582 1726855278.17541: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.17542: done checking to see if all hosts have failed 30582 1726855278.17542: getting the remaining hosts for this loop 30582 1726855278.17544: done getting the remaining hosts for this loop 30582 1726855278.17548: getting the next task for host managed_node3 30582 1726855278.17556: done getting next task for host managed_node3 30582 1726855278.17558: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30582 1726855278.17563: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855278.17568: getting variables 30582 1726855278.17570: in VariableManager get_vars() 30582 1726855278.17609: Calling all_inventory to load vars for managed_node3 30582 1726855278.17612: Calling groups_inventory to load vars for managed_node3 30582 1726855278.17616: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.17628: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.17631: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.17633: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.19381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.20963: done with get_vars() 30582 1726855278.20993: done getting variables 30582 1726855278.21052: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855278.21170: variable 'profile' from source: play vars 30582 1726855278.21174: variable 'interface' from source: 
play vars 30582 1726855278.21237: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 14:01:18 -0400 (0:00:00.063) 0:00:14.562 ****** 30582 1726855278.21275: entering _queue_task() for managed_node3/assert 30582 1726855278.21677: worker is 1 (out of 1 available) 30582 1726855278.21695: exiting _queue_task() for managed_node3/assert 30582 1726855278.21708: done queuing things up, now waiting for results queue to drain 30582 1726855278.21710: waiting for pending results... 30582 1726855278.22026: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30582 1726855278.22083: in run() - task 0affcc66-ac2b-aa83-7d57-000000000385 30582 1726855278.22108: variable 'ansible_search_path' from source: unknown 30582 1726855278.22121: variable 'ansible_search_path' from source: unknown 30582 1726855278.22160: calling self._execute() 30582 1726855278.22292: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.22296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.22298: variable 'omit' from source: magic vars 30582 1726855278.22635: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.22653: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.22674: variable 'omit' from source: magic vars 30582 1726855278.22723: variable 'omit' from source: magic vars 30582 1726855278.22880: variable 'profile' from source: play vars 30582 1726855278.22884: variable 'interface' from source: play vars 30582 1726855278.22907: variable 'interface' from source: play vars 30582 1726855278.22929: variable 'omit' from source: magic vars 30582 1726855278.22972: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855278.23017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855278.23040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855278.23063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.23083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.23125: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855278.23202: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.23205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.23252: Set connection var ansible_timeout to 10 30582 1726855278.23259: Set connection var ansible_connection to ssh 30582 1726855278.23270: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855278.23279: Set connection var ansible_pipelining to False 30582 1726855278.23288: Set connection var ansible_shell_executable to /bin/sh 30582 1726855278.23296: Set connection var ansible_shell_type to sh 30582 1726855278.23364: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.23372: variable 'ansible_connection' from source: unknown 30582 1726855278.23378: variable 'ansible_module_compression' from source: unknown 30582 1726855278.23384: variable 'ansible_shell_type' from source: unknown 30582 1726855278.23394: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.23400: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.23409: variable 'ansible_pipelining' from source: unknown 30582 1726855278.23421: variable 'ansible_timeout' from 
source: unknown 30582 1726855278.23429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.23594: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855278.23597: variable 'omit' from source: magic vars 30582 1726855278.23599: starting attempt loop 30582 1726855278.23602: running the handler 30582 1726855278.23709: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30582 1726855278.23720: Evaluated conditional (lsr_net_profile_ansible_managed): True 30582 1726855278.23729: handler run complete 30582 1726855278.23855: attempt loop complete, returning result 30582 1726855278.23858: _execute() done 30582 1726855278.23860: dumping result to json 30582 1726855278.23862: done dumping result, returning 30582 1726855278.23864: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcc66-ac2b-aa83-7d57-000000000385] 30582 1726855278.23866: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000385 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855278.23989: no more pending results, returning what we have 30582 1726855278.23992: results queue empty 30582 1726855278.23993: checking for any_errors_fatal 30582 1726855278.24002: done checking for any_errors_fatal 30582 1726855278.24002: checking for max_fail_percentage 30582 1726855278.24005: done checking for max_fail_percentage 30582 1726855278.24006: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.24007: done checking to see if all hosts have failed 30582 1726855278.24008: getting the remaining hosts for this loop 30582 1726855278.24010: done getting the remaining 
hosts for this loop 30582 1726855278.24014: getting the next task for host managed_node3 30582 1726855278.24022: done getting next task for host managed_node3 30582 1726855278.24025: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30582 1726855278.24030: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855278.24035: getting variables 30582 1726855278.24037: in VariableManager get_vars() 30582 1726855278.24073: Calling all_inventory to load vars for managed_node3 30582 1726855278.24076: Calling groups_inventory to load vars for managed_node3 30582 1726855278.24080: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.24200: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.24205: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.24210: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000385 30582 1726855278.24213: WORKER PROCESS EXITING 30582 1726855278.24218: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.26415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.28026: done with get_vars() 30582 1726855278.28056: done getting variables 30582 1726855278.28342: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855278.28447: variable 'profile' from source: play vars 30582 1726855278.28451: variable 'interface' from source: play vars 30582 1726855278.28718: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 14:01:18 -0400 (0:00:00.074) 0:00:14.637 ****** 30582 1726855278.28750: entering _queue_task() for managed_node3/assert 30582 1726855278.29460: worker is 1 (out of 1 available) 30582 1726855278.29470: exiting _queue_task() for managed_node3/assert 30582 
1726855278.29481: done queuing things up, now waiting for results queue to drain 30582 1726855278.29483: waiting for pending results... 30582 1726855278.30004: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30582 1726855278.30397: in run() - task 0affcc66-ac2b-aa83-7d57-000000000386 30582 1726855278.30401: variable 'ansible_search_path' from source: unknown 30582 1726855278.30404: variable 'ansible_search_path' from source: unknown 30582 1726855278.30406: calling self._execute() 30582 1726855278.30409: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.30411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.30413: variable 'omit' from source: magic vars 30582 1726855278.31109: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.31121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.31129: variable 'omit' from source: magic vars 30582 1726855278.31174: variable 'omit' from source: magic vars 30582 1726855278.31479: variable 'profile' from source: play vars 30582 1726855278.31483: variable 'interface' from source: play vars 30582 1726855278.31549: variable 'interface' from source: play vars 30582 1726855278.31568: variable 'omit' from source: magic vars 30582 1726855278.31814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855278.31850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855278.31870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855278.31891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.31904: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.31937: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855278.31940: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.31942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.32256: Set connection var ansible_timeout to 10 30582 1726855278.32259: Set connection var ansible_connection to ssh 30582 1726855278.32269: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855278.32275: Set connection var ansible_pipelining to False 30582 1726855278.32281: Set connection var ansible_shell_executable to /bin/sh 30582 1726855278.32283: Set connection var ansible_shell_type to sh 30582 1726855278.32314: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.32317: variable 'ansible_connection' from source: unknown 30582 1726855278.32320: variable 'ansible_module_compression' from source: unknown 30582 1726855278.32322: variable 'ansible_shell_type' from source: unknown 30582 1726855278.32324: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.32332: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.32334: variable 'ansible_pipelining' from source: unknown 30582 1726855278.32337: variable 'ansible_timeout' from source: unknown 30582 1726855278.32339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.32676: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855278.32686: variable 'omit' from source: magic vars 30582 1726855278.32697: starting 
attempt loop 30582 1726855278.32700: running the handler 30582 1726855278.33093: variable 'lsr_net_profile_fingerprint' from source: set_fact 30582 1726855278.33096: Evaluated conditional (lsr_net_profile_fingerprint): True 30582 1726855278.33098: handler run complete 30582 1726855278.33100: attempt loop complete, returning result 30582 1726855278.33102: _execute() done 30582 1726855278.33103: dumping result to json 30582 1726855278.33105: done dumping result, returning 30582 1726855278.33107: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcc66-ac2b-aa83-7d57-000000000386] 30582 1726855278.33108: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000386 30582 1726855278.33165: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000386 30582 1726855278.33167: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855278.33245: no more pending results, returning what we have 30582 1726855278.33249: results queue empty 30582 1726855278.33250: checking for any_errors_fatal 30582 1726855278.33258: done checking for any_errors_fatal 30582 1726855278.33259: checking for max_fail_percentage 30582 1726855278.33261: done checking for max_fail_percentage 30582 1726855278.33262: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.33263: done checking to see if all hosts have failed 30582 1726855278.33264: getting the remaining hosts for this loop 30582 1726855278.33266: done getting the remaining hosts for this loop 30582 1726855278.33270: getting the next task for host managed_node3 30582 1726855278.33281: done getting next task for host managed_node3 30582 1726855278.33284: ^ task is: TASK: Conditional asserts 30582 1726855278.33289: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855278.33294: getting variables 30582 1726855278.33296: in VariableManager get_vars() 30582 1726855278.33329: Calling all_inventory to load vars for managed_node3 30582 1726855278.33332: Calling groups_inventory to load vars for managed_node3 30582 1726855278.33336: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.33348: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.33350: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.33353: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.35970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.39434: done with get_vars() 30582 1726855278.39468: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 14:01:18 -0400 (0:00:00.108) 0:00:14.745 ****** 30582 1726855278.39572: entering _queue_task() for managed_node3/include_tasks 30582 1726855278.40447: worker is 1 (out of 1 available) 30582 1726855278.40463: exiting _queue_task() for managed_node3/include_tasks 30582 1726855278.40475: done queuing things up, now waiting for results queue to drain 30582 1726855278.40476: waiting for pending results... 
30582 1726855278.41026: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855278.41229: in run() - task 0affcc66-ac2b-aa83-7d57-000000000097 30582 1726855278.41331: variable 'ansible_search_path' from source: unknown 30582 1726855278.41336: variable 'ansible_search_path' from source: unknown 30582 1726855278.41911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855278.46365: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855278.46594: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855278.46659: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855278.46779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855278.46817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855278.47034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855278.47294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855278.47297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855278.47301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30582 1726855278.47303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855278.47592: variable 'lsr_assert_when' from source: include params 30582 1726855278.48115: variable 'network_provider' from source: set_fact 30582 1726855278.48360: variable 'omit' from source: magic vars 30582 1726855278.48827: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.48842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.48856: variable 'omit' from source: magic vars 30582 1726855278.49292: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.49307: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.49428: variable 'item' from source: unknown 30582 1726855278.49601: Evaluated conditional (item['condition']): True 30582 1726855278.49686: variable 'item' from source: unknown 30582 1726855278.49726: variable 'item' from source: unknown 30582 1726855278.49955: variable 'item' from source: unknown 30582 1726855278.50192: dumping result to json 30582 1726855278.50196: done dumping result, returning 30582 1726855278.50198: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-000000000097] 30582 1726855278.50200: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000097 30582 1726855278.50248: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000097 30582 1726855278.50252: WORKER PROCESS EXITING 30582 1726855278.50280: no more pending results, returning what we have 30582 1726855278.50286: in VariableManager get_vars() 30582 1726855278.50326: Calling all_inventory to load vars for managed_node3 30582 1726855278.50329: Calling groups_inventory to load vars for managed_node3 30582 1726855278.50332: 
Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.50344: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.50347: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.50350: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.52867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.56331: done with get_vars() 30582 1726855278.56352: variable 'ansible_search_path' from source: unknown 30582 1726855278.56354: variable 'ansible_search_path' from source: unknown 30582 1726855278.56397: we have included files to process 30582 1726855278.56399: generating all_blocks data 30582 1726855278.56400: done generating all_blocks data 30582 1726855278.56405: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855278.56406: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855278.56409: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855278.56673: in VariableManager get_vars() 30582 1726855278.56696: done with get_vars() 30582 1726855278.57008: done processing included file 30582 1726855278.57011: iterating over new_blocks loaded from include file 30582 1726855278.57012: in VariableManager get_vars() 30582 1726855278.57027: done with get_vars() 30582 1726855278.57028: filtering new block on tags 30582 1726855278.57062: done filtering new block on tags 30582 1726855278.57065: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 30582 1726855278.57070: extending task lists for all hosts with included blocks 30582 1726855278.58564: done extending task lists 30582 1726855278.58566: done processing included files 30582 1726855278.58567: results queue empty 30582 1726855278.58568: checking for any_errors_fatal 30582 1726855278.58571: done checking for any_errors_fatal 30582 1726855278.58572: checking for max_fail_percentage 30582 1726855278.58573: done checking for max_fail_percentage 30582 1726855278.58574: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.58575: done checking to see if all hosts have failed 30582 1726855278.58575: getting the remaining hosts for this loop 30582 1726855278.58577: done getting the remaining hosts for this loop 30582 1726855278.58579: getting the next task for host managed_node3 30582 1726855278.58584: done getting next task for host managed_node3 30582 1726855278.58586: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855278.58590: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855278.58600: getting variables 30582 1726855278.58601: in VariableManager get_vars() 30582 1726855278.58614: Calling all_inventory to load vars for managed_node3 30582 1726855278.58617: Calling groups_inventory to load vars for managed_node3 30582 1726855278.58619: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.58626: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.58628: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.58631: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.59863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.61580: done with get_vars() 30582 1726855278.61811: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 14:01:18 -0400 (0:00:00.223) 0:00:14.968 ****** 30582 1726855278.61900: entering _queue_task() for managed_node3/include_tasks 30582 1726855278.62666: worker is 1 (out of 1 available) 30582 1726855278.62681: exiting _queue_task() for managed_node3/include_tasks 30582 1726855278.62695: done queuing things up, now waiting for results queue to drain 30582 1726855278.62697: waiting for pending results... 
30582 1726855278.63233: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855278.63748: in run() - task 0affcc66-ac2b-aa83-7d57-000000000452 30582 1726855278.63752: variable 'ansible_search_path' from source: unknown 30582 1726855278.63755: variable 'ansible_search_path' from source: unknown 30582 1726855278.63757: calling self._execute() 30582 1726855278.63823: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.63833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.63847: variable 'omit' from source: magic vars 30582 1726855278.64670: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.64781: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.64789: _execute() done 30582 1726855278.64792: dumping result to json 30582 1726855278.64794: done dumping result, returning 30582 1726855278.64797: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-000000000452] 30582 1726855278.64799: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000452 30582 1726855278.65209: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000452 30582 1726855278.65214: WORKER PROCESS EXITING 30582 1726855278.65246: no more pending results, returning what we have 30582 1726855278.65252: in VariableManager get_vars() 30582 1726855278.65296: Calling all_inventory to load vars for managed_node3 30582 1726855278.65299: Calling groups_inventory to load vars for managed_node3 30582 1726855278.65303: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.65319: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.65323: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.65326: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855278.68245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.72055: done with get_vars() 30582 1726855278.72084: variable 'ansible_search_path' from source: unknown 30582 1726855278.72085: variable 'ansible_search_path' from source: unknown 30582 1726855278.72458: variable 'item' from source: include params 30582 1726855278.72903: we have included files to process 30582 1726855278.72904: generating all_blocks data 30582 1726855278.72906: done generating all_blocks data 30582 1726855278.72908: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855278.72909: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855278.72911: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855278.73527: done processing included file 30582 1726855278.73529: iterating over new_blocks loaded from include file 30582 1726855278.73530: in VariableManager get_vars() 30582 1726855278.73548: done with get_vars() 30582 1726855278.73551: filtering new block on tags 30582 1726855278.73579: done filtering new block on tags 30582 1726855278.73582: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855278.73590: extending task lists for all hosts with included blocks 30582 1726855278.74184: done extending task lists 30582 1726855278.74186: done processing included files 30582 1726855278.74186: results queue empty 30582 1726855278.74189: checking for any_errors_fatal 30582 1726855278.74192: done checking for any_errors_fatal 30582 1726855278.74193: checking for 
max_fail_percentage 30582 1726855278.74194: done checking for max_fail_percentage 30582 1726855278.74195: checking to see if all hosts have failed and the running result is not ok 30582 1726855278.74196: done checking to see if all hosts have failed 30582 1726855278.74197: getting the remaining hosts for this loop 30582 1726855278.74198: done getting the remaining hosts for this loop 30582 1726855278.74200: getting the next task for host managed_node3 30582 1726855278.74205: done getting next task for host managed_node3 30582 1726855278.74207: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855278.74210: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855278.74212: getting variables 30582 1726855278.74213: in VariableManager get_vars() 30582 1726855278.74223: Calling all_inventory to load vars for managed_node3 30582 1726855278.74225: Calling groups_inventory to load vars for managed_node3 30582 1726855278.74227: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855278.74232: Calling all_plugins_play to load vars for managed_node3 30582 1726855278.74235: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855278.74238: Calling groups_plugins_play to load vars for managed_node3 30582 1726855278.76726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855278.78280: done with get_vars() 30582 1726855278.78311: done getting variables 30582 1726855278.78432: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:01:18 -0400 (0:00:00.165) 0:00:15.134 ****** 30582 1726855278.78462: entering _queue_task() for managed_node3/stat 30582 1726855278.79132: worker is 1 (out of 1 available) 30582 1726855278.79147: exiting _queue_task() for managed_node3/stat 30582 1726855278.79159: done queuing things up, now waiting for results queue to drain 30582 1726855278.79160: waiting for pending results... 
30582 1726855278.79707: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855278.79712: in run() - task 0affcc66-ac2b-aa83-7d57-0000000004e8 30582 1726855278.79715: variable 'ansible_search_path' from source: unknown 30582 1726855278.79719: variable 'ansible_search_path' from source: unknown 30582 1726855278.79723: calling self._execute() 30582 1726855278.79727: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.79730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.79734: variable 'omit' from source: magic vars 30582 1726855278.80239: variable 'ansible_distribution_major_version' from source: facts 30582 1726855278.80242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855278.80246: variable 'omit' from source: magic vars 30582 1726855278.80248: variable 'omit' from source: magic vars 30582 1726855278.80374: variable 'interface' from source: play vars 30582 1726855278.80415: variable 'omit' from source: magic vars 30582 1726855278.80470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855278.80521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855278.80552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855278.80570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.80582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855278.80614: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855278.80617: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.80620: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.80731: Set connection var ansible_timeout to 10 30582 1726855278.80735: Set connection var ansible_connection to ssh 30582 1726855278.80741: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855278.80746: Set connection var ansible_pipelining to False 30582 1726855278.80756: Set connection var ansible_shell_executable to /bin/sh 30582 1726855278.80759: Set connection var ansible_shell_type to sh 30582 1726855278.80781: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.80789: variable 'ansible_connection' from source: unknown 30582 1726855278.80792: variable 'ansible_module_compression' from source: unknown 30582 1726855278.80795: variable 'ansible_shell_type' from source: unknown 30582 1726855278.80797: variable 'ansible_shell_executable' from source: unknown 30582 1726855278.80800: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855278.80802: variable 'ansible_pipelining' from source: unknown 30582 1726855278.80805: variable 'ansible_timeout' from source: unknown 30582 1726855278.80807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855278.81109: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855278.81114: variable 'omit' from source: magic vars 30582 1726855278.81116: starting attempt loop 30582 1726855278.81118: running the handler 30582 1726855278.81120: _low_level_execute_command(): starting 30582 1726855278.81122: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855278.81808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855278.81853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855278.81878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855278.81976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855278.83801: stdout chunk (state=3): >>>/root <<< 30582 1726855278.83849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855278.83853: stdout chunk (state=3): >>><<< 30582 1726855278.83859: stderr chunk (state=3): >>><<< 30582 1726855278.83958: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855278.83962: _low_level_execute_command(): starting 30582 1726855278.83965: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948 `" && echo ansible-tmp-1726855278.838847-31254-265043462942948="` echo /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948 `" ) && sleep 0' 30582 1726855278.84664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855278.84672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855278.84689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855278.84703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855278.84719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855278.84722: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855278.84733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855278.84748: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 30582 1726855278.84755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855278.84762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855278.84771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855278.84862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855278.84866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855278.84982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855278.86902: stdout chunk (state=3): >>>ansible-tmp-1726855278.838847-31254-265043462942948=/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948 <<< 30582 1726855278.87060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855278.87107: stderr chunk (state=3): >>><<< 30582 1726855278.87111: stdout chunk (state=3): >>><<< 30582 1726855278.87114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855278.838847-31254-265043462942948=/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855278.87276: variable 'ansible_module_compression' from source: unknown 30582 1726855278.87279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855278.87281: variable 'ansible_facts' from source: unknown 30582 1726855278.87353: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py 30582 1726855278.87526: Sending initial data 30582 1726855278.87529: Sent initial data (152 bytes) 30582 1726855278.88104: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855278.88113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855278.88124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855278.88140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855278.88168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855278.88262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855278.88266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855278.88269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855278.88391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855278.89997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855278.90086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855278.90169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpa4mg1s39 /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py <<< 30582 1726855278.90172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py" <<< 30582 1726855278.90227: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpa4mg1s39" to remote "/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py" <<< 30582 1726855278.90946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855278.91051: stderr chunk (state=3): >>><<< 30582 1726855278.91054: stdout chunk (state=3): >>><<< 30582 1726855278.91056: done transferring module to remote 30582 1726855278.91058: _low_level_execute_command(): starting 30582 1726855278.91060: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/ /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py && sleep 0' 30582 1726855278.91683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855278.91698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855278.91708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855278.91722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855278.91736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855278.91741: 
stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855278.91757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855278.91801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855278.91868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855278.91875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855278.91920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855278.91996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855278.93819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855278.93861: stderr chunk (state=3): >>><<< 30582 1726855278.93864: stdout chunk (state=3): >>><<< 30582 1726855278.93874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855278.93877: _low_level_execute_command(): starting 30582 1726855278.93882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/AnsiballZ_stat.py && sleep 0' 30582 1726855278.94557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855278.94575: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855278.94660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.09851: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32242, "dev": 23, "nlink": 1, "atime": 1726855276.0737445, "mtime": 1726855276.0737445, "ctime": 1726855276.0737445, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855279.11179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855279.11207: stderr chunk (state=3): >>><<< 30582 1726855279.11211: stdout chunk (state=3): >>><<< 30582 1726855279.11231: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32242, "dev": 23, "nlink": 1, "atime": 1726855276.0737445, "mtime": 1726855276.0737445, "ctime": 1726855276.0737445, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855279.11267: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855279.11275: _low_level_execute_command(): starting 30582 1726855279.11281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855278.838847-31254-265043462942948/ > /dev/null 2>&1 && sleep 0' 30582 1726855279.11744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855279.11747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.11750: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855279.11752: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855279.11754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.11811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.11814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855279.11820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.11885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.13722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.13748: stderr chunk (state=3): >>><<< 30582 1726855279.13751: stdout chunk (state=3): >>><<< 30582 1726855279.13766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855279.13772: handler run complete 30582 1726855279.13809: attempt loop complete, returning result 30582 1726855279.13812: _execute() done 30582 1726855279.13814: dumping result to json 30582 1726855279.13819: done dumping result, returning 30582 1726855279.13827: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-0000000004e8] 30582 1726855279.13831: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000004e8 30582 1726855279.13941: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000004e8 30582 1726855279.13943: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726855276.0737445, "block_size": 4096, "blocks": 0, "ctime": 1726855276.0737445, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32242, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726855276.0737445, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30582 1726855279.14063: no more pending 
results, returning what we have 30582 1726855279.14067: results queue empty 30582 1726855279.14068: checking for any_errors_fatal 30582 1726855279.14069: done checking for any_errors_fatal 30582 1726855279.14070: checking for max_fail_percentage 30582 1726855279.14071: done checking for max_fail_percentage 30582 1726855279.14072: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.14073: done checking to see if all hosts have failed 30582 1726855279.14074: getting the remaining hosts for this loop 30582 1726855279.14075: done getting the remaining hosts for this loop 30582 1726855279.14079: getting the next task for host managed_node3 30582 1726855279.14088: done getting next task for host managed_node3 30582 1726855279.14091: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30582 1726855279.14094: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.14098: getting variables 30582 1726855279.14099: in VariableManager get_vars() 30582 1726855279.14126: Calling all_inventory to load vars for managed_node3 30582 1726855279.14128: Calling groups_inventory to load vars for managed_node3 30582 1726855279.14131: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.14140: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.14142: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.14145: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.15523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.16383: done with get_vars() 30582 1726855279.16404: done getting variables 30582 1726855279.16449: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855279.16537: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 14:01:19 -0400 (0:00:00.381) 0:00:15.515 ****** 30582 1726855279.16564: entering _queue_task() for managed_node3/assert 30582 1726855279.16827: worker is 1 (out of 1 available) 30582 1726855279.16841: exiting _queue_task() for managed_node3/assert 30582 1726855279.16853: done queuing things up, now waiting for results queue to drain 30582 1726855279.16855: waiting for pending results... 
30582 1726855279.17046: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30582 1726855279.17120: in run() - task 0affcc66-ac2b-aa83-7d57-000000000453 30582 1726855279.17132: variable 'ansible_search_path' from source: unknown 30582 1726855279.17135: variable 'ansible_search_path' from source: unknown 30582 1726855279.17200: calling self._execute() 30582 1726855279.17265: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.17268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.17365: variable 'omit' from source: magic vars 30582 1726855279.17898: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.17902: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.17906: variable 'omit' from source: magic vars 30582 1726855279.17908: variable 'omit' from source: magic vars 30582 1726855279.17911: variable 'interface' from source: play vars 30582 1726855279.17919: variable 'omit' from source: magic vars 30582 1726855279.17961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855279.18000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855279.18020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855279.18036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.18049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.18079: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855279.18083: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.18085: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.18186: Set connection var ansible_timeout to 10 30582 1726855279.18191: Set connection var ansible_connection to ssh 30582 1726855279.18210: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855279.18213: Set connection var ansible_pipelining to False 30582 1726855279.18215: Set connection var ansible_shell_executable to /bin/sh 30582 1726855279.18222: Set connection var ansible_shell_type to sh 30582 1726855279.18235: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.18238: variable 'ansible_connection' from source: unknown 30582 1726855279.18241: variable 'ansible_module_compression' from source: unknown 30582 1726855279.18243: variable 'ansible_shell_type' from source: unknown 30582 1726855279.18245: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.18248: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.18336: variable 'ansible_pipelining' from source: unknown 30582 1726855279.18339: variable 'ansible_timeout' from source: unknown 30582 1726855279.18342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.18398: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855279.18408: variable 'omit' from source: magic vars 30582 1726855279.18414: starting attempt loop 30582 1726855279.18417: running the handler 30582 1726855279.18553: variable 'interface_stat' from source: set_fact 30582 1726855279.18567: Evaluated conditional (interface_stat.stat.exists): True 30582 1726855279.18572: handler run complete 30582 1726855279.18590: attempt loop complete, returning result 30582 
1726855279.18593: _execute() done 30582 1726855279.18596: dumping result to json 30582 1726855279.18598: done dumping result, returning 30582 1726855279.18606: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000453] 30582 1726855279.18611: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000453 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855279.18853: no more pending results, returning what we have 30582 1726855279.18856: results queue empty 30582 1726855279.18857: checking for any_errors_fatal 30582 1726855279.18864: done checking for any_errors_fatal 30582 1726855279.18865: checking for max_fail_percentage 30582 1726855279.18866: done checking for max_fail_percentage 30582 1726855279.18867: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.18868: done checking to see if all hosts have failed 30582 1726855279.18868: getting the remaining hosts for this loop 30582 1726855279.18870: done getting the remaining hosts for this loop 30582 1726855279.18875: getting the next task for host managed_node3 30582 1726855279.18882: done getting next task for host managed_node3 30582 1726855279.18885: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855279.18889: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.18893: getting variables 30582 1726855279.18894: in VariableManager get_vars() 30582 1726855279.18922: Calling all_inventory to load vars for managed_node3 30582 1726855279.18924: Calling groups_inventory to load vars for managed_node3 30582 1726855279.18927: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.18936: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.18938: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.18941: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.19486: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000453 30582 1726855279.19492: WORKER PROCESS EXITING 30582 1726855279.20292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.26768: done with get_vars() 30582 1726855279.26800: done getting variables 30582 1726855279.26849: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855279.26950: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 14:01:19 -0400 (0:00:00.104) 0:00:15.619 ****** 30582 1726855279.26976: entering _queue_task() for managed_node3/debug 30582 1726855279.27330: worker is 1 (out of 1 available) 30582 1726855279.27343: exiting _queue_task() for managed_node3/debug 30582 1726855279.27354: done queuing things up, now waiting for results queue to drain 30582 1726855279.27356: waiting for pending results... 
30582 1726855279.27653: running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile' 30582 1726855279.27777: in run() - task 0affcc66-ac2b-aa83-7d57-000000000098 30582 1726855279.27800: variable 'ansible_search_path' from source: unknown 30582 1726855279.27815: variable 'ansible_search_path' from source: unknown 30582 1726855279.27853: calling self._execute() 30582 1726855279.27994: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.27998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.28001: variable 'omit' from source: magic vars 30582 1726855279.28367: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.28382: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.28395: variable 'omit' from source: magic vars 30582 1726855279.28431: variable 'omit' from source: magic vars 30582 1726855279.28578: variable 'lsr_description' from source: include params 30582 1726855279.28581: variable 'omit' from source: magic vars 30582 1726855279.28624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855279.28667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855279.28706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855279.28794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.28798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.28801: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855279.28803: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.28805: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.28911: Set connection var ansible_timeout to 10 30582 1726855279.28924: Set connection var ansible_connection to ssh 30582 1726855279.28936: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855279.28948: Set connection var ansible_pipelining to False 30582 1726855279.28958: Set connection var ansible_shell_executable to /bin/sh 30582 1726855279.28965: Set connection var ansible_shell_type to sh 30582 1726855279.29010: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.29014: variable 'ansible_connection' from source: unknown 30582 1726855279.29017: variable 'ansible_module_compression' from source: unknown 30582 1726855279.29019: variable 'ansible_shell_type' from source: unknown 30582 1726855279.29021: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.29192: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.29195: variable 'ansible_pipelining' from source: unknown 30582 1726855279.29198: variable 'ansible_timeout' from source: unknown 30582 1726855279.29200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.29204: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855279.29221: variable 'omit' from source: magic vars 30582 1726855279.29233: starting attempt loop 30582 1726855279.29240: running the handler 30582 1726855279.29294: handler run complete 30582 1726855279.29320: attempt loop complete, returning result 30582 1726855279.29332: _execute() done 30582 1726855279.29339: dumping result to json 30582 1726855279.29347: done dumping result, returning 30582 
1726855279.29359: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile' [0affcc66-ac2b-aa83-7d57-000000000098] 30582 1726855279.29369: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000098 30582 1726855279.29793: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000098 30582 1726855279.29796: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 30582 1726855279.29841: no more pending results, returning what we have 30582 1726855279.29844: results queue empty 30582 1726855279.29845: checking for any_errors_fatal 30582 1726855279.29852: done checking for any_errors_fatal 30582 1726855279.29853: checking for max_fail_percentage 30582 1726855279.29854: done checking for max_fail_percentage 30582 1726855279.29855: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.29856: done checking to see if all hosts have failed 30582 1726855279.29857: getting the remaining hosts for this loop 30582 1726855279.29858: done getting the remaining hosts for this loop 30582 1726855279.29862: getting the next task for host managed_node3 30582 1726855279.29869: done getting next task for host managed_node3 30582 1726855279.29873: ^ task is: TASK: Cleanup 30582 1726855279.29876: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.29880: getting variables 30582 1726855279.29882: in VariableManager get_vars() 30582 1726855279.29916: Calling all_inventory to load vars for managed_node3 30582 1726855279.29919: Calling groups_inventory to load vars for managed_node3 30582 1726855279.29922: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.29932: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.29935: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.29938: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.31315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.32578: done with get_vars() 30582 1726855279.32600: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:01:19 -0400 (0:00:00.056) 0:00:15.676 ****** 30582 1726855279.32669: entering _queue_task() for managed_node3/include_tasks 30582 1726855279.32916: worker is 1 (out of 1 available) 30582 1726855279.32930: exiting _queue_task() for managed_node3/include_tasks 30582 1726855279.32942: done queuing things up, now waiting for results queue to drain 30582 1726855279.32943: waiting for pending results... 
30582 1726855279.33133: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855279.33207: in run() - task 0affcc66-ac2b-aa83-7d57-00000000009c 30582 1726855279.33217: variable 'ansible_search_path' from source: unknown 30582 1726855279.33221: variable 'ansible_search_path' from source: unknown 30582 1726855279.33256: variable 'lsr_cleanup' from source: include params 30582 1726855279.33417: variable 'lsr_cleanup' from source: include params 30582 1726855279.33468: variable 'omit' from source: magic vars 30582 1726855279.33562: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.33569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.33577: variable 'omit' from source: magic vars 30582 1726855279.33751: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.33759: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.33765: variable 'item' from source: unknown 30582 1726855279.33815: variable 'item' from source: unknown 30582 1726855279.33857: variable 'item' from source: unknown 30582 1726855279.33899: variable 'item' from source: unknown 30582 1726855279.34025: dumping result to json 30582 1726855279.34028: done dumping result, returning 30582 1726855279.34030: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-00000000009c] 30582 1726855279.34032: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000009c 30582 1726855279.34065: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000009c 30582 1726855279.34067: WORKER PROCESS EXITING 30582 1726855279.34095: no more pending results, returning what we have 30582 1726855279.34101: in VariableManager get_vars() 30582 1726855279.34135: Calling all_inventory to load vars for managed_node3 30582 1726855279.34138: Calling groups_inventory to load vars for managed_node3 30582 1726855279.34141: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855279.34153: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.34156: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.34158: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.35528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.36382: done with get_vars() 30582 1726855279.36399: variable 'ansible_search_path' from source: unknown 30582 1726855279.36400: variable 'ansible_search_path' from source: unknown 30582 1726855279.36428: we have included files to process 30582 1726855279.36429: generating all_blocks data 30582 1726855279.36430: done generating all_blocks data 30582 1726855279.36432: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855279.36433: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855279.36434: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855279.36608: done processing included file 30582 1726855279.36610: iterating over new_blocks loaded from include file 30582 1726855279.36611: in VariableManager get_vars() 30582 1726855279.36621: done with get_vars() 30582 1726855279.36623: filtering new block on tags 30582 1726855279.36638: done filtering new block on tags 30582 1726855279.36639: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855279.36643: extending task lists for all hosts with included blocks 
30582 1726855279.37625: done extending task lists 30582 1726855279.37627: done processing included files 30582 1726855279.37627: results queue empty 30582 1726855279.37628: checking for any_errors_fatal 30582 1726855279.37631: done checking for any_errors_fatal 30582 1726855279.37632: checking for max_fail_percentage 30582 1726855279.37633: done checking for max_fail_percentage 30582 1726855279.37634: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.37635: done checking to see if all hosts have failed 30582 1726855279.37636: getting the remaining hosts for this loop 30582 1726855279.37637: done getting the remaining hosts for this loop 30582 1726855279.37640: getting the next task for host managed_node3 30582 1726855279.37645: done getting next task for host managed_node3 30582 1726855279.37647: ^ task is: TASK: Cleanup profile and device 30582 1726855279.37649: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.37651: getting variables 30582 1726855279.37652: in VariableManager get_vars() 30582 1726855279.37662: Calling all_inventory to load vars for managed_node3 30582 1726855279.37665: Calling groups_inventory to load vars for managed_node3 30582 1726855279.37667: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.37672: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.37675: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.37677: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.38781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.39642: done with get_vars() 30582 1726855279.39656: done getting variables 30582 1726855279.39695: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:01:19 -0400 (0:00:00.070) 0:00:15.747 ****** 30582 1726855279.39717: entering _queue_task() for managed_node3/shell 30582 1726855279.39969: worker is 1 (out of 1 available) 30582 1726855279.39986: exiting _queue_task() for managed_node3/shell 30582 1726855279.39998: done queuing things up, now waiting for results queue to drain 30582 1726855279.40000: waiting for pending results... 
30582 1726855279.40173: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855279.40276: in run() - task 0affcc66-ac2b-aa83-7d57-00000000050b 30582 1726855279.40317: variable 'ansible_search_path' from source: unknown 30582 1726855279.40321: variable 'ansible_search_path' from source: unknown 30582 1726855279.40333: calling self._execute() 30582 1726855279.40419: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.40424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.40433: variable 'omit' from source: magic vars 30582 1726855279.40791: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.41000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.41003: variable 'omit' from source: magic vars 30582 1726855279.41006: variable 'omit' from source: magic vars 30582 1726855279.41009: variable 'interface' from source: play vars 30582 1726855279.41011: variable 'omit' from source: magic vars 30582 1726855279.41060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855279.41109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855279.41119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855279.41134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.41149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.41184: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855279.41202: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.41214: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.41276: Set connection var ansible_timeout to 10 30582 1726855279.41280: Set connection var ansible_connection to ssh 30582 1726855279.41286: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855279.41293: Set connection var ansible_pipelining to False 30582 1726855279.41298: Set connection var ansible_shell_executable to /bin/sh 30582 1726855279.41301: Set connection var ansible_shell_type to sh 30582 1726855279.41320: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.41325: variable 'ansible_connection' from source: unknown 30582 1726855279.41327: variable 'ansible_module_compression' from source: unknown 30582 1726855279.41331: variable 'ansible_shell_type' from source: unknown 30582 1726855279.41333: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.41335: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.41340: variable 'ansible_pipelining' from source: unknown 30582 1726855279.41342: variable 'ansible_timeout' from source: unknown 30582 1726855279.41347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.41486: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855279.41506: variable 'omit' from source: magic vars 30582 1726855279.41508: starting attempt loop 30582 1726855279.41511: running the handler 30582 1726855279.41514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855279.41566: _low_level_execute_command(): starting 30582 1726855279.41570: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855279.42246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855279.42264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.42364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.44055: stdout chunk (state=3): >>>/root <<< 30582 1726855279.44151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.44184: stderr chunk (state=3): >>><<< 30582 1726855279.44191: stdout chunk (state=3): >>><<< 30582 1726855279.44211: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855279.44222: _low_level_execute_command(): starting 30582 1726855279.44229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837 `" && echo ansible-tmp-1726855279.4421124-31283-153039904408837="` echo /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837 `" ) && sleep 0' 30582 1726855279.44980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.44998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855279.45033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.45115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.47004: stdout chunk (state=3): >>>ansible-tmp-1726855279.4421124-31283-153039904408837=/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837 <<< 30582 1726855279.47108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.47137: stderr chunk (state=3): >>><<< 30582 1726855279.47140: stdout chunk (state=3): >>><<< 30582 1726855279.47158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855279.4421124-31283-153039904408837=/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855279.47193: variable 'ansible_module_compression' from source: unknown 30582 1726855279.47234: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855279.47264: variable 'ansible_facts' from source: unknown 30582 1726855279.47326: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py 30582 1726855279.47432: Sending initial data 30582 1726855279.47436: Sent initial data (156 bytes) 30582 1726855279.47864: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855279.47891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855279.47895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855279.47898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.47900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855279.47911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.47963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.47966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855279.47969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.48034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.49619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855279.49675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855279.49732: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppdshqu_d /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py <<< 30582 1726855279.49738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py" <<< 30582 1726855279.49791: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppdshqu_d" to remote "/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py" <<< 30582 1726855279.49795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py" <<< 30582 1726855279.50422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.50466: stderr chunk (state=3): >>><<< 30582 1726855279.50470: stdout chunk (state=3): >>><<< 30582 1726855279.50500: done transferring module to remote 30582 1726855279.50512: _low_level_execute_command(): starting 30582 1726855279.50516: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/ /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py && sleep 0' 30582 1726855279.50974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855279.50977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855279.50980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.50982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855279.50984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855279.50986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.51037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.51040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.51110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.52869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.52898: stderr chunk (state=3): >>><<< 30582 1726855279.52901: stdout chunk (state=3): >>><<< 30582 1726855279.52914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855279.52917: _low_level_execute_command(): starting 30582 1726855279.52924: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/AnsiballZ_command.py && sleep 0' 30582 1726855279.53376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855279.53379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855279.53381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855279.53386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855279.53392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855279.53433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.53437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855279.53442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.53507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.74424: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (9fc70a3d-08d2-4d99-b645-a6e60c4199d8) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:19.684592", "end": "2024-09-20 14:01:19.741644", "delta": "0:00:00.057052", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855279.76195: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855279.76199: stdout chunk (state=3): >>><<< 30582 1726855279.76202: stderr chunk (state=3): >>><<< 30582 1726855279.76204: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (9fc70a3d-08d2-4d99-b645-a6e60c4199d8) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:19.684592", "end": "2024-09-20 14:01:19.741644", "delta": "0:00:00.057052", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 30582 1726855279.76208: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855279.76212: _low_level_execute_command(): starting 30582 1726855279.76214: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855279.4421124-31283-153039904408837/ > /dev/null 2>&1 && sleep 0' 30582 1726855279.77220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855279.77474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855279.77622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855279.77679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855279.79554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855279.79569: stdout chunk (state=3): >>><<< 30582 1726855279.79596: stderr chunk (state=3): >>><<< 30582 1726855279.79624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855279.79637: handler run complete 30582 1726855279.79663: Evaluated conditional (False): False 30582 1726855279.79678: attempt loop complete, returning result 30582 1726855279.79699: _execute() done 30582 1726855279.79707: dumping result to json 30582 1726855279.79717: done dumping result, returning 30582 1726855279.79729: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-00000000050b] 30582 1726855279.79738: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000050b fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.057052", "end": "2024-09-20 14:01:19.741644", "rc": 1, "start": "2024-09-20 14:01:19.684592" } STDOUT: Connection 'statebr' (9fc70a3d-08d2-4d99-b645-a6e60c4199d8) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855279.79936: no more pending results, returning what we have 30582 1726855279.79941: results queue empty 30582 1726855279.79942: checking for any_errors_fatal 30582 1726855279.79944: done checking for any_errors_fatal 30582 1726855279.79945: checking for max_fail_percentage 30582 1726855279.79947: done checking for max_fail_percentage 30582 1726855279.79948: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.79948: done checking to see if all hosts have failed 30582 1726855279.79949: getting the remaining hosts for this loop 30582 1726855279.79951: done getting the remaining hosts for this loop 30582 1726855279.79955: getting the next task for host managed_node3 30582 1726855279.80198: done getting next task for host managed_node3 30582 1726855279.80202: ^ task is: TASK: Include the task 'run_test.yml' 30582 1726855279.80204: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.80208: getting variables 30582 1726855279.80210: in VariableManager get_vars() 30582 1726855279.80241: Calling all_inventory to load vars for managed_node3 30582 1726855279.80244: Calling groups_inventory to load vars for managed_node3 30582 1726855279.80248: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.80260: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.80263: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.80265: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.80913: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000050b 30582 1726855279.80916: WORKER PROCESS EXITING 30582 1726855279.81954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.83534: done with get_vars() 30582 1726855279.83560: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Friday 20 September 2024 14:01:19 -0400 (0:00:00.439) 0:00:16.186 ****** 30582 1726855279.83658: entering _queue_task() for managed_node3/include_tasks 30582 1726855279.84016: worker is 1 (out of 1 available) 30582 1726855279.84030: exiting _queue_task() for managed_node3/include_tasks 30582 1726855279.84041: done queuing things up, now waiting for results queue to drain 30582 1726855279.84043: waiting for pending results... 
30582 1726855279.84414: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30582 1726855279.84424: in run() - task 0affcc66-ac2b-aa83-7d57-00000000000f 30582 1726855279.84441: variable 'ansible_search_path' from source: unknown 30582 1726855279.84485: calling self._execute() 30582 1726855279.84585: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.84600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.84620: variable 'omit' from source: magic vars 30582 1726855279.85028: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.85052: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.85067: _execute() done 30582 1726855279.85077: dumping result to json 30582 1726855279.85086: done dumping result, returning 30582 1726855279.85292: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-00000000000f] 30582 1726855279.85297: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000000f 30582 1726855279.85382: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000000f 30582 1726855279.85385: WORKER PROCESS EXITING 30582 1726855279.85417: no more pending results, returning what we have 30582 1726855279.85423: in VariableManager get_vars() 30582 1726855279.85465: Calling all_inventory to load vars for managed_node3 30582 1726855279.85469: Calling groups_inventory to load vars for managed_node3 30582 1726855279.85474: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.85493: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.85497: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.85501: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.86981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30582 1726855279.88668: done with get_vars() 30582 1726855279.88691: variable 'ansible_search_path' from source: unknown 30582 1726855279.88708: we have included files to process 30582 1726855279.88709: generating all_blocks data 30582 1726855279.88711: done generating all_blocks data 30582 1726855279.88715: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855279.88716: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855279.88719: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855279.89117: in VariableManager get_vars() 30582 1726855279.89135: done with get_vars() 30582 1726855279.89174: in VariableManager get_vars() 30582 1726855279.89191: done with get_vars() 30582 1726855279.89231: in VariableManager get_vars() 30582 1726855279.89246: done with get_vars() 30582 1726855279.89283: in VariableManager get_vars() 30582 1726855279.89300: done with get_vars() 30582 1726855279.89337: in VariableManager get_vars() 30582 1726855279.89351: done with get_vars() 30582 1726855279.89722: in VariableManager get_vars() 30582 1726855279.89736: done with get_vars() 30582 1726855279.89748: done processing included file 30582 1726855279.89750: iterating over new_blocks loaded from include file 30582 1726855279.89751: in VariableManager get_vars() 30582 1726855279.89760: done with get_vars() 30582 1726855279.89762: filtering new block on tags 30582 1726855279.89861: done filtering new block on tags 30582 1726855279.89864: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30582 1726855279.89869: extending task lists for all hosts with included 
blocks 30582 1726855279.89904: done extending task lists 30582 1726855279.89906: done processing included files 30582 1726855279.89906: results queue empty 30582 1726855279.89907: checking for any_errors_fatal 30582 1726855279.89912: done checking for any_errors_fatal 30582 1726855279.89913: checking for max_fail_percentage 30582 1726855279.89915: done checking for max_fail_percentage 30582 1726855279.89915: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.89916: done checking to see if all hosts have failed 30582 1726855279.89917: getting the remaining hosts for this loop 30582 1726855279.89918: done getting the remaining hosts for this loop 30582 1726855279.89921: getting the next task for host managed_node3 30582 1726855279.89925: done getting next task for host managed_node3 30582 1726855279.89927: ^ task is: TASK: TEST: {{ lsr_description }} 30582 1726855279.89929: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.89931: getting variables 30582 1726855279.89932: in VariableManager get_vars() 30582 1726855279.89941: Calling all_inventory to load vars for managed_node3 30582 1726855279.89943: Calling groups_inventory to load vars for managed_node3 30582 1726855279.89945: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.89950: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.89952: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.89955: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.91035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.92538: done with get_vars() 30582 1726855279.92565: done getting variables 30582 1726855279.92615: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855279.92732: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 14:01:19 -0400 (0:00:00.091) 0:00:16.277 ****** 30582 1726855279.92762: entering _queue_task() for managed_node3/debug 30582 1726855279.93118: worker is 1 (out of 1 available) 30582 1726855279.93129: exiting _queue_task() for managed_node3/debug 30582 1726855279.93141: done queuing things up, now waiting for results queue to drain 30582 1726855279.93143: waiting for pending results... 
30582 1726855279.93605: running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile without autoconnect 30582 1726855279.93609: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b4 30582 1726855279.93612: variable 'ansible_search_path' from source: unknown 30582 1726855279.93615: variable 'ansible_search_path' from source: unknown 30582 1726855279.93617: calling self._execute() 30582 1726855279.93670: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.93681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.93696: variable 'omit' from source: magic vars 30582 1726855279.94074: variable 'ansible_distribution_major_version' from source: facts 30582 1726855279.94093: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855279.94105: variable 'omit' from source: magic vars 30582 1726855279.94145: variable 'omit' from source: magic vars 30582 1726855279.94246: variable 'lsr_description' from source: include params 30582 1726855279.94273: variable 'omit' from source: magic vars 30582 1726855279.94324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855279.94365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855279.94396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855279.94417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.94433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855279.94467: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855279.94475: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855279.94493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.94591: Set connection var ansible_timeout to 10 30582 1726855279.94692: Set connection var ansible_connection to ssh 30582 1726855279.94695: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855279.94697: Set connection var ansible_pipelining to False 30582 1726855279.94699: Set connection var ansible_shell_executable to /bin/sh 30582 1726855279.94702: Set connection var ansible_shell_type to sh 30582 1726855279.94708: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.94711: variable 'ansible_connection' from source: unknown 30582 1726855279.94713: variable 'ansible_module_compression' from source: unknown 30582 1726855279.94715: variable 'ansible_shell_type' from source: unknown 30582 1726855279.94717: variable 'ansible_shell_executable' from source: unknown 30582 1726855279.94718: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.94720: variable 'ansible_pipelining' from source: unknown 30582 1726855279.94722: variable 'ansible_timeout' from source: unknown 30582 1726855279.94724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.94835: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855279.94850: variable 'omit' from source: magic vars 30582 1726855279.94859: starting attempt loop 30582 1726855279.94865: running the handler 30582 1726855279.94917: handler run complete 30582 1726855279.94940: attempt loop complete, returning result 30582 1726855279.94946: _execute() done 30582 1726855279.94952: dumping result to json 30582 1726855279.94959: done dumping result, returning 
30582 1726855279.94970: done running TaskExecutor() for managed_node3/TASK: TEST: I can create a profile without autoconnect [0affcc66-ac2b-aa83-7d57-0000000005b4] 30582 1726855279.94978: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b4 30582 1726855279.95239: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b4 30582 1726855279.95243: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can create a profile without autoconnect ########## 30582 1726855279.95292: no more pending results, returning what we have 30582 1726855279.95296: results queue empty 30582 1726855279.95297: checking for any_errors_fatal 30582 1726855279.95299: done checking for any_errors_fatal 30582 1726855279.95300: checking for max_fail_percentage 30582 1726855279.95302: done checking for max_fail_percentage 30582 1726855279.95302: checking to see if all hosts have failed and the running result is not ok 30582 1726855279.95303: done checking to see if all hosts have failed 30582 1726855279.95304: getting the remaining hosts for this loop 30582 1726855279.95305: done getting the remaining hosts for this loop 30582 1726855279.95310: getting the next task for host managed_node3 30582 1726855279.95318: done getting next task for host managed_node3 30582 1726855279.95321: ^ task is: TASK: Show item 30582 1726855279.95324: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855279.95328: getting variables 30582 1726855279.95330: in VariableManager get_vars() 30582 1726855279.95363: Calling all_inventory to load vars for managed_node3 30582 1726855279.95366: Calling groups_inventory to load vars for managed_node3 30582 1726855279.95369: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855279.95382: Calling all_plugins_play to load vars for managed_node3 30582 1726855279.95384: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855279.95390: Calling groups_plugins_play to load vars for managed_node3 30582 1726855279.96939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855279.98529: done with get_vars() 30582 1726855279.98559: done getting variables 30582 1726855279.98627: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 14:01:19 -0400 (0:00:00.058) 0:00:16.336 ****** 30582 1726855279.98660: entering _queue_task() for managed_node3/debug 30582 1726855279.99021: worker is 1 (out of 1 available) 30582 1726855279.99034: exiting _queue_task() for managed_node3/debug 30582 1726855279.99045: done queuing things up, now waiting for results queue to drain 30582 1726855279.99047: waiting for pending results... 
30582 1726855279.99328: running TaskExecutor() for managed_node3/TASK: Show item 30582 1726855279.99495: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b5 30582 1726855279.99499: variable 'ansible_search_path' from source: unknown 30582 1726855279.99502: variable 'ansible_search_path' from source: unknown 30582 1726855279.99514: variable 'omit' from source: magic vars 30582 1726855279.99654: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855279.99668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855279.99682: variable 'omit' from source: magic vars 30582 1726855280.00035: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.00057: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.00067: variable 'omit' from source: magic vars 30582 1726855280.00108: variable 'omit' from source: magic vars 30582 1726855280.00155: variable 'item' from source: unknown 30582 1726855280.00232: variable 'item' from source: unknown 30582 1726855280.00274: variable 'omit' from source: magic vars 30582 1726855280.00305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855280.00346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.00382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855280.00397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.00412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.00692: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.00696: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855280.00698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.00700: Set connection var ansible_timeout to 10 30582 1726855280.00702: Set connection var ansible_connection to ssh 30582 1726855280.00704: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.00706: Set connection var ansible_pipelining to False 30582 1726855280.00708: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.00710: Set connection var ansible_shell_type to sh 30582 1726855280.00712: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.00714: variable 'ansible_connection' from source: unknown 30582 1726855280.00716: variable 'ansible_module_compression' from source: unknown 30582 1726855280.00718: variable 'ansible_shell_type' from source: unknown 30582 1726855280.00719: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.00721: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.00723: variable 'ansible_pipelining' from source: unknown 30582 1726855280.00725: variable 'ansible_timeout' from source: unknown 30582 1726855280.00727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.00794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.00810: variable 'omit' from source: magic vars 30582 1726855280.00820: starting attempt loop 30582 1726855280.00826: running the handler 30582 1726855280.00877: variable 'lsr_description' from source: include params 30582 1726855280.00953: variable 'lsr_description' from source: include params 30582 1726855280.00967: handler run complete 30582 1726855280.00991: attempt loop 
complete, returning result 30582 1726855280.01014: variable 'item' from source: unknown 30582 1726855280.01089: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 30582 1726855280.01493: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.01497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.01501: variable 'omit' from source: magic vars 30582 1726855280.01521: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.01534: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.01617: variable 'omit' from source: magic vars 30582 1726855280.01620: variable 'omit' from source: magic vars 30582 1726855280.01623: variable 'item' from source: unknown 30582 1726855280.01674: variable 'item' from source: unknown 30582 1726855280.01697: variable 'omit' from source: magic vars 30582 1726855280.01728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.01743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.01756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.01775: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.01784: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.01795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.01882: Set connection var ansible_timeout to 10 30582 1726855280.01894: Set connection var ansible_connection to ssh 30582 
1726855280.01908: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.01944: Set connection var ansible_pipelining to False 30582 1726855280.01947: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.01949: Set connection var ansible_shell_type to sh 30582 1726855280.01959: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.01965: variable 'ansible_connection' from source: unknown 30582 1726855280.01972: variable 'ansible_module_compression' from source: unknown 30582 1726855280.02052: variable 'ansible_shell_type' from source: unknown 30582 1726855280.02055: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.02058: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.02060: variable 'ansible_pipelining' from source: unknown 30582 1726855280.02061: variable 'ansible_timeout' from source: unknown 30582 1726855280.02063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.02100: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.02114: variable 'omit' from source: magic vars 30582 1726855280.02122: starting attempt loop 30582 1726855280.02127: running the handler 30582 1726855280.02152: variable 'lsr_setup' from source: include params 30582 1726855280.02228: variable 'lsr_setup' from source: include params 30582 1726855280.02281: handler run complete 30582 1726855280.02302: attempt loop complete, returning result 30582 1726855280.02321: variable 'item' from source: unknown 30582 1726855280.02388: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30582 1726855280.02606: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.02609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.02611: variable 'omit' from source: magic vars 30582 1726855280.02712: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.02724: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.02730: variable 'omit' from source: magic vars 30582 1726855280.02748: variable 'omit' from source: magic vars 30582 1726855280.02782: variable 'item' from source: unknown 30582 1726855280.02843: variable 'item' from source: unknown 30582 1726855280.02937: variable 'omit' from source: magic vars 30582 1726855280.02941: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.02943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.02945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.02948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.02950: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.02952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.03011: Set connection var ansible_timeout to 10 30582 1726855280.03019: Set connection var ansible_connection to ssh 30582 1726855280.03031: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.03045: Set connection var ansible_pipelining to False 30582 1726855280.03055: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.03062: 
Set connection var ansible_shell_type to sh 30582 1726855280.03085: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.03096: variable 'ansible_connection' from source: unknown 30582 1726855280.03104: variable 'ansible_module_compression' from source: unknown 30582 1726855280.03110: variable 'ansible_shell_type' from source: unknown 30582 1726855280.03117: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.03123: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.03154: variable 'ansible_pipelining' from source: unknown 30582 1726855280.03157: variable 'ansible_timeout' from source: unknown 30582 1726855280.03160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.03240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.03254: variable 'omit' from source: magic vars 30582 1726855280.03371: starting attempt loop 30582 1726855280.03375: running the handler 30582 1726855280.03377: variable 'lsr_test' from source: include params 30582 1726855280.03379: variable 'lsr_test' from source: include params 30582 1726855280.03381: handler run complete 30582 1726855280.03399: attempt loop complete, returning result 30582 1726855280.03418: variable 'item' from source: unknown 30582 1726855280.03489: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 30582 1726855280.03645: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.03693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855280.03697: variable 'omit' from source: magic vars 30582 1726855280.03830: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.03841: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.03849: variable 'omit' from source: magic vars 30582 1726855280.03872: variable 'omit' from source: magic vars 30582 1726855280.03992: variable 'item' from source: unknown 30582 1726855280.03995: variable 'item' from source: unknown 30582 1726855280.03997: variable 'omit' from source: magic vars 30582 1726855280.04022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.04034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.04045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.04060: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.04067: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.04075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.04157: Set connection var ansible_timeout to 10 30582 1726855280.04165: Set connection var ansible_connection to ssh 30582 1726855280.04178: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.04189: Set connection var ansible_pipelining to False 30582 1726855280.04200: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.04207: Set connection var ansible_shell_type to sh 30582 1726855280.04231: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.04242: variable 'ansible_connection' from source: unknown 30582 1726855280.04347: variable 'ansible_module_compression' from 
source: unknown 30582 1726855280.04350: variable 'ansible_shell_type' from source: unknown 30582 1726855280.04353: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.04355: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.04357: variable 'ansible_pipelining' from source: unknown 30582 1726855280.04359: variable 'ansible_timeout' from source: unknown 30582 1726855280.04361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.04373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.04385: variable 'omit' from source: magic vars 30582 1726855280.04397: starting attempt loop 30582 1726855280.04403: running the handler 30582 1726855280.04426: variable 'lsr_assert' from source: include params 30582 1726855280.04493: variable 'lsr_assert' from source: include params 30582 1726855280.04516: handler run complete 30582 1726855280.04534: attempt loop complete, returning result 30582 1726855280.04552: variable 'item' from source: unknown 30582 1726855280.04617: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 30582 1726855280.04892: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.04895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.04898: variable 'omit' from source: magic vars 30582 1726855280.04991: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.05002: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30582 1726855280.05021: variable 'omit' from source: magic vars 30582 1726855280.05039: variable 'omit' from source: magic vars 30582 1726855280.05083: variable 'item' from source: unknown 30582 1726855280.05149: variable 'item' from source: unknown 30582 1726855280.05230: variable 'omit' from source: magic vars 30582 1726855280.05233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.05236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.05238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.05240: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.05242: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.05245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.05315: Set connection var ansible_timeout to 10 30582 1726855280.05323: Set connection var ansible_connection to ssh 30582 1726855280.05338: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.05349: Set connection var ansible_pipelining to False 30582 1726855280.05358: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.05364: Set connection var ansible_shell_type to sh 30582 1726855280.05389: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.05398: variable 'ansible_connection' from source: unknown 30582 1726855280.05406: variable 'ansible_module_compression' from source: unknown 30582 1726855280.05447: variable 'ansible_shell_type' from source: unknown 30582 1726855280.05450: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.05452: variable 'ansible_host' from source: host vars 
for 'managed_node3' 30582 1726855280.05454: variable 'ansible_pipelining' from source: unknown 30582 1726855280.05456: variable 'ansible_timeout' from source: unknown 30582 1726855280.05458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.05538: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.05555: variable 'omit' from source: magic vars 30582 1726855280.05564: starting attempt loop 30582 1726855280.05571: running the handler 30582 1726855280.05773: handler run complete 30582 1726855280.05776: attempt loop complete, returning result 30582 1726855280.05779: variable 'item' from source: unknown 30582 1726855280.05781: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30582 1726855280.05926: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.05939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.05992: variable 'omit' from source: magic vars 30582 1726855280.06100: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.06112: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.06120: variable 'omit' from source: magic vars 30582 1726855280.06138: variable 'omit' from source: magic vars 30582 1726855280.06180: variable 'item' from source: unknown 30582 1726855280.06239: variable 'item' from source: unknown 30582 1726855280.06254: variable 'omit' from source: magic vars 30582 1726855280.06292: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.06295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.06297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.06305: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.06316: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.06423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.06426: Set connection var ansible_timeout to 10 30582 1726855280.06428: Set connection var ansible_connection to ssh 30582 1726855280.06430: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.06432: Set connection var ansible_pipelining to False 30582 1726855280.06433: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.06435: Set connection var ansible_shell_type to sh 30582 1726855280.06449: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.06455: variable 'ansible_connection' from source: unknown 30582 1726855280.06460: variable 'ansible_module_compression' from source: unknown 30582 1726855280.06465: variable 'ansible_shell_type' from source: unknown 30582 1726855280.06470: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.06475: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.06480: variable 'ansible_pipelining' from source: unknown 30582 1726855280.06485: variable 'ansible_timeout' from source: unknown 30582 1726855280.06493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.06581: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.06595: variable 'omit' from source: magic vars 30582 1726855280.06603: starting attempt loop 30582 1726855280.06609: running the handler 30582 1726855280.06628: variable 'lsr_fail_debug' from source: play vars 30582 1726855280.06693: variable 'lsr_fail_debug' from source: play vars 30582 1726855280.06715: handler run complete 30582 1726855280.06732: attempt loop complete, returning result 30582 1726855280.06753: variable 'item' from source: unknown 30582 1726855280.06813: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855280.07049: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.07052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.07055: variable 'omit' from source: magic vars 30582 1726855280.07140: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.07150: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.07163: variable 'omit' from source: magic vars 30582 1726855280.07182: variable 'omit' from source: magic vars 30582 1726855280.07225: variable 'item' from source: unknown 30582 1726855280.07291: variable 'item' from source: unknown 30582 1726855280.07310: variable 'omit' from source: magic vars 30582 1726855280.07331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.07343: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.07353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.07377: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.07385: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.07395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.07472: Set connection var ansible_timeout to 10 30582 1726855280.07594: Set connection var ansible_connection to ssh 30582 1726855280.07597: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.07599: Set connection var ansible_pipelining to False 30582 1726855280.07601: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.07603: Set connection var ansible_shell_type to sh 30582 1726855280.07605: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.07607: variable 'ansible_connection' from source: unknown 30582 1726855280.07608: variable 'ansible_module_compression' from source: unknown 30582 1726855280.07610: variable 'ansible_shell_type' from source: unknown 30582 1726855280.07612: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.07614: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.07616: variable 'ansible_pipelining' from source: unknown 30582 1726855280.07617: variable 'ansible_timeout' from source: unknown 30582 1726855280.07619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.07660: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.07674: variable 'omit' from source: magic vars 30582 1726855280.07682: starting attempt loop 30582 1726855280.07691: running the handler 30582 1726855280.07718: variable 'lsr_cleanup' from source: include params 30582 1726855280.07780: variable 'lsr_cleanup' from source: include params 30582 1726855280.07807: handler run complete 30582 1726855280.07828: attempt loop complete, returning result 30582 1726855280.07847: variable 'item' from source: unknown 30582 1726855280.07910: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30582 1726855280.08138: dumping result to json 30582 1726855280.08141: done dumping result, returning 30582 1726855280.08144: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-0000000005b5] 30582 1726855280.08146: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b5 30582 1726855280.08197: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b5 30582 1726855280.08200: WORKER PROCESS EXITING 30582 1726855280.08295: no more pending results, returning what we have 30582 1726855280.08299: results queue empty 30582 1726855280.08300: checking for any_errors_fatal 30582 1726855280.08307: done checking for any_errors_fatal 30582 1726855280.08308: checking for max_fail_percentage 30582 1726855280.08309: done checking for max_fail_percentage 30582 1726855280.08310: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.08311: done checking to see if all hosts have failed 30582 1726855280.08312: getting the remaining hosts for this loop 30582 1726855280.08314: done getting the remaining hosts for this loop 30582 
1726855280.08317: getting the next task for host managed_node3 30582 1726855280.08325: done getting next task for host managed_node3 30582 1726855280.08328: ^ task is: TASK: Include the task 'show_interfaces.yml' 30582 1726855280.08331: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855280.08335: getting variables 30582 1726855280.08336: in VariableManager get_vars() 30582 1726855280.08368: Calling all_inventory to load vars for managed_node3 30582 1726855280.08371: Calling groups_inventory to load vars for managed_node3 30582 1726855280.08375: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.08387: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.08391: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.08394: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.09889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.11534: done with get_vars() 30582 1726855280.11557: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21
Friday 20 September 2024 14:01:20 -0400 (0:00:00.129) 0:00:16.466 ******

30582 1726855280.11657: entering _queue_task() for managed_node3/include_tasks 30582 
1726855280.12008: worker is 1 (out of 1 available) 30582 1726855280.12022: exiting _queue_task() for managed_node3/include_tasks 30582 1726855280.12034: done queuing things up, now waiting for results queue to drain 30582 1726855280.12035: waiting for pending results... 30582 1726855280.12328: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855280.12496: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b6 30582 1726855280.12499: variable 'ansible_search_path' from source: unknown 30582 1726855280.12501: variable 'ansible_search_path' from source: unknown 30582 1726855280.12508: calling self._execute() 30582 1726855280.12582: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.12595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.12610: variable 'omit' from source: magic vars 30582 1726855280.12976: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.12995: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.13059: _execute() done 30582 1726855280.13062: dumping result to json 30582 1726855280.13065: done dumping result, returning 30582 1726855280.13067: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-0000000005b6] 30582 1726855280.13069: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b6 30582 1726855280.13144: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b6 30582 1726855280.13147: WORKER PROCESS EXITING 30582 1726855280.13192: no more pending results, returning what we have 30582 1726855280.13198: in VariableManager get_vars() 30582 1726855280.13237: Calling all_inventory to load vars for managed_node3 30582 1726855280.13240: Calling groups_inventory to load vars for managed_node3 30582 1726855280.13244: Calling all_plugins_inventory to load vars for managed_node3 
30582 1726855280.13258: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.13261: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.13264: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.14852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.16407: done with get_vars() 30582 1726855280.16428: variable 'ansible_search_path' from source: unknown 30582 1726855280.16429: variable 'ansible_search_path' from source: unknown 30582 1726855280.16468: we have included files to process 30582 1726855280.16469: generating all_blocks data 30582 1726855280.16471: done generating all_blocks data 30582 1726855280.16476: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855280.16477: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855280.16480: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855280.16583: in VariableManager get_vars() 30582 1726855280.16604: done with get_vars() 30582 1726855280.16716: done processing included file 30582 1726855280.16718: iterating over new_blocks loaded from include file 30582 1726855280.16720: in VariableManager get_vars() 30582 1726855280.16733: done with get_vars() 30582 1726855280.16734: filtering new block on tags 30582 1726855280.16767: done filtering new block on tags 30582 1726855280.16770: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30582 1726855280.16775: extending task lists for all hosts with included blocks 30582 1726855280.17214: 
done extending task lists 30582 1726855280.17216: done processing included files 30582 1726855280.17216: results queue empty 30582 1726855280.17217: checking for any_errors_fatal 30582 1726855280.17223: done checking for any_errors_fatal 30582 1726855280.17224: checking for max_fail_percentage 30582 1726855280.17225: done checking for max_fail_percentage 30582 1726855280.17226: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.17227: done checking to see if all hosts have failed 30582 1726855280.17228: getting the remaining hosts for this loop 30582 1726855280.17229: done getting the remaining hosts for this loop 30582 1726855280.17231: getting the next task for host managed_node3 30582 1726855280.17235: done getting next task for host managed_node3 30582 1726855280.17237: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855280.17240: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855280.17243: getting variables 30582 1726855280.17244: in VariableManager get_vars() 30582 1726855280.17253: Calling all_inventory to load vars for managed_node3 30582 1726855280.17256: Calling groups_inventory to load vars for managed_node3 30582 1726855280.17258: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.17263: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.17265: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.17268: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.18486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.20012: done with get_vars() 30582 1726855280.20037: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 14:01:20 -0400 (0:00:00.084) 0:00:16.551 ******

30582 1726855280.20113: entering _queue_task() for managed_node3/include_tasks 30582 1726855280.20477: worker is 1 (out of 1 available) 30582 1726855280.20694: exiting _queue_task() for managed_node3/include_tasks 30582 1726855280.20705: done queuing things up, now waiting for results queue to drain 30582 1726855280.20707: waiting for pending results... 
30582 1726855280.20906: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855280.20943: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005dd 30582 1726855280.20964: variable 'ansible_search_path' from source: unknown 30582 1726855280.20972: variable 'ansible_search_path' from source: unknown 30582 1726855280.21015: calling self._execute() 30582 1726855280.21114: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.21152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.21156: variable 'omit' from source: magic vars 30582 1726855280.21516: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.21534: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.21589: _execute() done 30582 1726855280.21592: dumping result to json 30582 1726855280.21595: done dumping result, returning 30582 1726855280.21597: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-0000000005dd] 30582 1726855280.21600: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005dd 30582 1726855280.21818: no more pending results, returning what we have 30582 1726855280.21824: in VariableManager get_vars() 30582 1726855280.21864: Calling all_inventory to load vars for managed_node3 30582 1726855280.21868: Calling groups_inventory to load vars for managed_node3 30582 1726855280.21872: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.21891: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.21895: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.21898: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.22499: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005dd 30582 1726855280.22502: WORKER PROCESS EXITING 30582 
1726855280.23382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.24944: done with get_vars() 30582 1726855280.24968: variable 'ansible_search_path' from source: unknown 30582 1726855280.24970: variable 'ansible_search_path' from source: unknown 30582 1726855280.25010: we have included files to process 30582 1726855280.25011: generating all_blocks data 30582 1726855280.25013: done generating all_blocks data 30582 1726855280.25015: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855280.25016: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855280.25018: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855280.25275: done processing included file 30582 1726855280.25277: iterating over new_blocks loaded from include file 30582 1726855280.25279: in VariableManager get_vars() 30582 1726855280.25295: done with get_vars() 30582 1726855280.25297: filtering new block on tags 30582 1726855280.25331: done filtering new block on tags 30582 1726855280.25334: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855280.25339: extending task lists for all hosts with included blocks 30582 1726855280.25496: done extending task lists 30582 1726855280.25498: done processing included files 30582 1726855280.25499: results queue empty 30582 1726855280.25499: checking for any_errors_fatal 30582 1726855280.25502: done checking for any_errors_fatal 30582 1726855280.25503: checking for max_fail_percentage 30582 1726855280.25504: done 
checking for max_fail_percentage 30582 1726855280.25505: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.25506: done checking to see if all hosts have failed 30582 1726855280.25506: getting the remaining hosts for this loop 30582 1726855280.25508: done getting the remaining hosts for this loop 30582 1726855280.25510: getting the next task for host managed_node3 30582 1726855280.25514: done getting next task for host managed_node3 30582 1726855280.25516: ^ task is: TASK: Gather current interface info 30582 1726855280.25520: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855280.25522: getting variables 30582 1726855280.25523: in VariableManager get_vars() 30582 1726855280.25532: Calling all_inventory to load vars for managed_node3 30582 1726855280.25534: Calling groups_inventory to load vars for managed_node3 30582 1726855280.25536: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.25542: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.25544: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.25547: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.26746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.28285: done with get_vars() 30582 1726855280.28313: done getting variables 30582 1726855280.28359: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:01:20 -0400 (0:00:00.082) 0:00:16.633 ****** 30582 1726855280.28395: entering _queue_task() for managed_node3/command 30582 1726855280.28741: worker is 1 (out of 1 available) 30582 1726855280.28754: exiting _queue_task() for managed_node3/command 30582 1726855280.28765: done queuing things up, now waiting for results queue to drain 30582 1726855280.28767: waiting for pending results... 
30582 1726855280.29060: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855280.29199: in run() - task 0affcc66-ac2b-aa83-7d57-000000000618 30582 1726855280.29219: variable 'ansible_search_path' from source: unknown 30582 1726855280.29228: variable 'ansible_search_path' from source: unknown 30582 1726855280.29264: calling self._execute() 30582 1726855280.29353: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.29364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.29377: variable 'omit' from source: magic vars 30582 1726855280.29739: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.29758: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.29776: variable 'omit' from source: magic vars 30582 1726855280.29827: variable 'omit' from source: magic vars 30582 1726855280.29859: variable 'omit' from source: magic vars 30582 1726855280.29906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855280.29942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.29965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855280.30094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.30098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.30101: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.30104: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.30107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855280.30157: Set connection var ansible_timeout to 10 30582 1726855280.30166: Set connection var ansible_connection to ssh 30582 1726855280.30179: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.30191: Set connection var ansible_pipelining to False 30582 1726855280.30202: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.30210: Set connection var ansible_shell_type to sh 30582 1726855280.30242: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.30250: variable 'ansible_connection' from source: unknown 30582 1726855280.30257: variable 'ansible_module_compression' from source: unknown 30582 1726855280.30264: variable 'ansible_shell_type' from source: unknown 30582 1726855280.30273: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.30279: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.30288: variable 'ansible_pipelining' from source: unknown 30582 1726855280.30297: variable 'ansible_timeout' from source: unknown 30582 1726855280.30305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.30461: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.30548: variable 'omit' from source: magic vars 30582 1726855280.30551: starting attempt loop 30582 1726855280.30553: running the handler 30582 1726855280.30556: _low_level_execute_command(): starting 30582 1726855280.30558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855280.31322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.31371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.31406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.31515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.33205: stdout chunk (state=3): >>>/root <<< 30582 1726855280.33307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.33343: stderr chunk (state=3): >>><<< 30582 1726855280.33346: stdout chunk (state=3): >>><<< 30582 1726855280.33359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.33377: _low_level_execute_command(): starting 30582 1726855280.33392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006 `" && echo ansible-tmp-1726855280.3336453-31314-152793759312006="` echo /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006 `" ) && sleep 0' 30582 1726855280.33836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855280.33840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855280.33842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855280.33846: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855280.33855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.33901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.33904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855280.33906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.33964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.35869: stdout chunk (state=3): >>>ansible-tmp-1726855280.3336453-31314-152793759312006=/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006 <<< 30582 1726855280.35972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.36005: stderr chunk (state=3): >>><<< 30582 1726855280.36008: stdout chunk (state=3): >>><<< 30582 1726855280.36024: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855280.3336453-31314-152793759312006=/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.36051: variable 'ansible_module_compression' from source: unknown 30582 1726855280.36093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855280.36127: variable 'ansible_facts' from source: unknown 30582 1726855280.36179: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py 30582 1726855280.36282: Sending initial data 30582 1726855280.36285: Sent initial data (156 bytes) 30582 1726855280.36719: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.36722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.36725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.36727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855280.36729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.36780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.36786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.36840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.38444: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855280.38511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855280.38565: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc78lwqhh /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py <<< 30582 1726855280.38568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py" <<< 30582 1726855280.38628: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc78lwqhh" to remote "/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py" <<< 30582 1726855280.38631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py" <<< 30582 1726855280.39223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.39262: stderr chunk (state=3): >>><<< 30582 1726855280.39265: stdout chunk (state=3): >>><<< 30582 1726855280.39281: done transferring module to remote 30582 1726855280.39297: _low_level_execute_command(): starting 30582 1726855280.39301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/ /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py && sleep 0' 30582 1726855280.39729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.39733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.39764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855280.39767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.39769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.39772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855280.39774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.39825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.39834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.39900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.41678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.41706: stderr chunk (state=3): >>><<< 30582 1726855280.41709: stdout chunk (state=3): >>><<< 30582 1726855280.41727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.41730: _low_level_execute_command(): starting 30582 1726855280.41735: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/AnsiballZ_command.py && sleep 0' 30582 1726855280.42157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.42161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.42163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855280.42165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855280.42167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.42217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.42220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.42294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.57952: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:20.575030", "end": "2024-09-20 14:01:20.578399", "delta": "0:00:00.003369", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855280.59470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855280.59494: stderr chunk (state=3): >>><<< 30582 1726855280.59498: stdout chunk (state=3): >>><<< 30582 1726855280.59514: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:20.575030", "end": "2024-09-20 14:01:20.578399", "delta": "0:00:00.003369", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855280.59548: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855280.59554: _low_level_execute_command(): starting 30582 1726855280.59560: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855280.3336453-31314-152793759312006/ > /dev/null 2>&1 && sleep 0' 30582 1726855280.59989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.60018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855280.60021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.60023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.60029: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.60081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.60089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855280.60092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.60145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.62194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.62197: stdout chunk (state=3): >>><<< 30582 1726855280.62200: stderr chunk (state=3): >>><<< 30582 1726855280.62202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.62204: handler run complete 30582 1726855280.62206: Evaluated conditional (False): False 30582 1726855280.62207: attempt loop complete, returning result 30582 1726855280.62209: _execute() done 30582 1726855280.62211: dumping result to json 30582 1726855280.62213: done dumping result, returning 30582 1726855280.62214: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-000000000618] 30582 1726855280.62216: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000618 30582 1726855280.62292: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000618 30582 1726855280.62295: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003369", "end": "2024-09-20 14:01:20.578399", "rc": 0, "start": "2024-09-20 14:01:20.575030" } STDOUT: bonding_masters eth0 lo rpltstbr 30582 1726855280.62375: no more pending results, returning what we have 30582 1726855280.62379: results queue empty 30582 1726855280.62380: checking for any_errors_fatal 30582 1726855280.62382: done checking for any_errors_fatal 30582 1726855280.62382: checking for max_fail_percentage 30582 1726855280.62385: done checking for max_fail_percentage 30582 1726855280.62385: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.62386: done checking to see if all hosts have failed 30582 1726855280.62389: getting the remaining hosts for this loop 30582 1726855280.62390: done getting the remaining hosts for this loop 30582 1726855280.62394: getting the next task for host managed_node3 30582 1726855280.62403: done getting next task for host managed_node3 30582 1726855280.62410: ^ task is: TASK: Set current_interfaces 30582 1726855280.62415: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855280.62421: getting variables 30582 1726855280.62422: in VariableManager get_vars() 30582 1726855280.62456: Calling all_inventory to load vars for managed_node3 30582 1726855280.62459: Calling groups_inventory to load vars for managed_node3 30582 1726855280.62462: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.62476: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.62479: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.62483: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.63903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.65659: done with get_vars() 30582 1726855280.65692: done getting variables 30582 1726855280.65779: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 14:01:20 -0400 (0:00:00.374) 0:00:17.008 ****** 30582 1726855280.65820: entering _queue_task() for managed_node3/set_fact 30582 1726855280.66380: worker is 1 (out of 1 available) 30582 1726855280.66395: exiting _queue_task() for managed_node3/set_fact 30582 1726855280.66405: done queuing things up, now waiting for results queue to drain 30582 1726855280.66406: waiting for pending results... 
30582 1726855280.66698: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855280.66795: in run() - task 0affcc66-ac2b-aa83-7d57-000000000619 30582 1726855280.66798: variable 'ansible_search_path' from source: unknown 30582 1726855280.66801: variable 'ansible_search_path' from source: unknown 30582 1726855280.66804: calling self._execute() 30582 1726855280.66899: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.66913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.66926: variable 'omit' from source: magic vars 30582 1726855280.67329: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.67350: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.67360: variable 'omit' from source: magic vars 30582 1726855280.67437: variable 'omit' from source: magic vars 30582 1726855280.67543: variable '_current_interfaces' from source: set_fact 30582 1726855280.67658: variable 'omit' from source: magic vars 30582 1726855280.67674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855280.67719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.67744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855280.67775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.67796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.67874: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.67885: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.67889: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.67963: Set connection var ansible_timeout to 10 30582 1726855280.67971: Set connection var ansible_connection to ssh 30582 1726855280.67999: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.68009: Set connection var ansible_pipelining to False 30582 1726855280.68019: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.68025: Set connection var ansible_shell_type to sh 30582 1726855280.68092: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.68100: variable 'ansible_connection' from source: unknown 30582 1726855280.68101: variable 'ansible_module_compression' from source: unknown 30582 1726855280.68103: variable 'ansible_shell_type' from source: unknown 30582 1726855280.68105: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.68106: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.68108: variable 'ansible_pipelining' from source: unknown 30582 1726855280.68109: variable 'ansible_timeout' from source: unknown 30582 1726855280.68110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.68221: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.68236: variable 'omit' from source: magic vars 30582 1726855280.68244: starting attempt loop 30582 1726855280.68251: running the handler 30582 1726855280.68268: handler run complete 30582 1726855280.68310: attempt loop complete, returning result 30582 1726855280.68319: _execute() done 30582 1726855280.68321: dumping result to json 30582 1726855280.68323: done dumping result, returning 30582 
1726855280.68329: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-000000000619] 30582 1726855280.68419: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000619 30582 1726855280.68502: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000619 30582 1726855280.68506: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30582 1726855280.68576: no more pending results, returning what we have 30582 1726855280.68580: results queue empty 30582 1726855280.68584: checking for any_errors_fatal 30582 1726855280.68595: done checking for any_errors_fatal 30582 1726855280.68596: checking for max_fail_percentage 30582 1726855280.68599: done checking for max_fail_percentage 30582 1726855280.68600: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.68601: done checking to see if all hosts have failed 30582 1726855280.68602: getting the remaining hosts for this loop 30582 1726855280.68603: done getting the remaining hosts for this loop 30582 1726855280.68608: getting the next task for host managed_node3 30582 1726855280.68619: done getting next task for host managed_node3 30582 1726855280.68623: ^ task is: TASK: Show current_interfaces 30582 1726855280.68627: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855280.68632: getting variables 30582 1726855280.68633: in VariableManager get_vars() 30582 1726855280.68668: Calling all_inventory to load vars for managed_node3 30582 1726855280.68672: Calling groups_inventory to load vars for managed_node3 30582 1726855280.68675: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.68794: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.68803: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.68808: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.70547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.72114: done with get_vars() 30582 1726855280.72141: done getting variables 30582 1726855280.72212: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:01:20 -0400 (0:00:00.064) 0:00:17.072 ****** 30582 1726855280.72244: entering _queue_task() for managed_node3/debug 30582 1726855280.72617: worker is 1 (out of 1 available) 30582 1726855280.72632: exiting _queue_task() for managed_node3/debug 30582 1726855280.72644: done queuing things up, now waiting for results queue to drain 30582 1726855280.72646: waiting for 
pending results... 30582 1726855280.73007: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855280.73012: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005de 30582 1726855280.73015: variable 'ansible_search_path' from source: unknown 30582 1726855280.73018: variable 'ansible_search_path' from source: unknown 30582 1726855280.73041: calling self._execute() 30582 1726855280.73130: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.73142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.73192: variable 'omit' from source: magic vars 30582 1726855280.73505: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.73521: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.73531: variable 'omit' from source: magic vars 30582 1726855280.73577: variable 'omit' from source: magic vars 30582 1726855280.73673: variable 'current_interfaces' from source: set_fact 30582 1726855280.73707: variable 'omit' from source: magic vars 30582 1726855280.73992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855280.73996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.73998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855280.74000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.74002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.74004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.74006: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.74008: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.74010: Set connection var ansible_timeout to 10 30582 1726855280.74012: Set connection var ansible_connection to ssh 30582 1726855280.74014: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.74016: Set connection var ansible_pipelining to False 30582 1726855280.74019: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.74022: Set connection var ansible_shell_type to sh 30582 1726855280.74036: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.74043: variable 'ansible_connection' from source: unknown 30582 1726855280.74049: variable 'ansible_module_compression' from source: unknown 30582 1726855280.74054: variable 'ansible_shell_type' from source: unknown 30582 1726855280.74060: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.74066: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.74072: variable 'ansible_pipelining' from source: unknown 30582 1726855280.74078: variable 'ansible_timeout' from source: unknown 30582 1726855280.74085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.74218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.74234: variable 'omit' from source: magic vars 30582 1726855280.74244: starting attempt loop 30582 1726855280.74250: running the handler 30582 1726855280.74301: handler run complete 30582 1726855280.74318: attempt loop complete, returning result 30582 1726855280.74325: _execute() done 30582 1726855280.74330: dumping result to json 30582 1726855280.74337: done dumping result, returning 30582 
1726855280.74348: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-0000000005de] 30582 1726855280.74358: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005de 30582 1726855280.74455: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005de ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855280.74509: no more pending results, returning what we have 30582 1726855280.74512: results queue empty 30582 1726855280.74513: checking for any_errors_fatal 30582 1726855280.74521: done checking for any_errors_fatal 30582 1726855280.74522: checking for max_fail_percentage 30582 1726855280.74524: done checking for max_fail_percentage 30582 1726855280.74525: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.74525: done checking to see if all hosts have failed 30582 1726855280.74526: getting the remaining hosts for this loop 30582 1726855280.74528: done getting the remaining hosts for this loop 30582 1726855280.74532: getting the next task for host managed_node3 30582 1726855280.74540: done getting next task for host managed_node3 30582 1726855280.74543: ^ task is: TASK: Setup 30582 1726855280.74545: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855280.74549: getting variables 30582 1726855280.74551: in VariableManager get_vars() 30582 1726855280.74693: Calling all_inventory to load vars for managed_node3 30582 1726855280.74696: Calling groups_inventory to load vars for managed_node3 30582 1726855280.74698: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.74708: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.74711: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.74714: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.75231: WORKER PROCESS EXITING 30582 1726855280.75984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.77785: done with get_vars() 30582 1726855280.77807: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:01:20 -0400 (0:00:00.056) 0:00:17.128 ****** 30582 1726855280.77902: entering _queue_task() for managed_node3/include_tasks 30582 1726855280.78225: worker is 1 (out of 1 available) 30582 1726855280.78238: exiting _queue_task() for managed_node3/include_tasks 30582 1726855280.78250: done queuing things up, now waiting for results queue to drain 30582 1726855280.78252: waiting for pending results... 
30582 1726855280.78482: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855280.78592: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b7 30582 1726855280.78611: variable 'ansible_search_path' from source: unknown 30582 1726855280.78618: variable 'ansible_search_path' from source: unknown 30582 1726855280.78661: variable 'lsr_setup' from source: include params 30582 1726855280.78858: variable 'lsr_setup' from source: include params 30582 1726855280.78935: variable 'omit' from source: magic vars 30582 1726855280.79119: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.79123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.79127: variable 'omit' from source: magic vars 30582 1726855280.79341: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.79358: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.79369: variable 'item' from source: unknown 30582 1726855280.79436: variable 'item' from source: unknown 30582 1726855280.79480: variable 'item' from source: unknown 30582 1726855280.79546: variable 'item' from source: unknown 30582 1726855280.79762: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.79993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.79997: variable 'omit' from source: magic vars 30582 1726855280.79999: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.80002: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.80004: variable 'item' from source: unknown 30582 1726855280.80006: variable 'item' from source: unknown 30582 1726855280.80033: variable 'item' from source: unknown 30582 1726855280.80104: variable 'item' from source: unknown 30582 1726855280.80184: dumping result to json 30582 1726855280.80196: done dumping result, returning 30582 
1726855280.80207: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-0000000005b7] 30582 1726855280.80218: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b7 30582 1726855280.80361: no more pending results, returning what we have 30582 1726855280.80367: in VariableManager get_vars() 30582 1726855280.80408: Calling all_inventory to load vars for managed_node3 30582 1726855280.80411: Calling groups_inventory to load vars for managed_node3 30582 1726855280.80415: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.80431: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.80435: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.80439: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.81117: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b7 30582 1726855280.81121: WORKER PROCESS EXITING 30582 1726855280.82439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.85534: done with get_vars() 30582 1726855280.85561: variable 'ansible_search_path' from source: unknown 30582 1726855280.85563: variable 'ansible_search_path' from source: unknown 30582 1726855280.85608: variable 'ansible_search_path' from source: unknown 30582 1726855280.85610: variable 'ansible_search_path' from source: unknown 30582 1726855280.85638: we have included files to process 30582 1726855280.85639: generating all_blocks data 30582 1726855280.85641: done generating all_blocks data 30582 1726855280.85646: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855280.85647: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855280.85649: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30582 1726855280.86158: done processing included file 30582 1726855280.86160: iterating over new_blocks loaded from include file 30582 1726855280.86162: in VariableManager get_vars() 30582 1726855280.86177: done with get_vars() 30582 1726855280.86179: filtering new block on tags 30582 1726855280.86207: done filtering new block on tags 30582 1726855280.86209: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 => (item=tasks/delete_interface.yml) 30582 1726855280.86214: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855280.86215: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855280.86219: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855280.86304: in VariableManager get_vars() 30582 1726855280.86323: done with get_vars() 30582 1726855280.86409: done processing included file 30582 1726855280.86411: iterating over new_blocks loaded from include file 30582 1726855280.86412: in VariableManager get_vars() 30582 1726855280.86425: done with get_vars() 30582 1726855280.86427: filtering new block on tags 30582 1726855280.86457: done filtering new block on tags 30582 1726855280.86460: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30582 1726855280.86464: extending task lists for all hosts with 
included blocks 30582 1726855280.87043: done extending task lists 30582 1726855280.87044: done processing included files 30582 1726855280.87045: results queue empty 30582 1726855280.87046: checking for any_errors_fatal 30582 1726855280.87049: done checking for any_errors_fatal 30582 1726855280.87050: checking for max_fail_percentage 30582 1726855280.87051: done checking for max_fail_percentage 30582 1726855280.87052: checking to see if all hosts have failed and the running result is not ok 30582 1726855280.87053: done checking to see if all hosts have failed 30582 1726855280.87053: getting the remaining hosts for this loop 30582 1726855280.87054: done getting the remaining hosts for this loop 30582 1726855280.87057: getting the next task for host managed_node3 30582 1726855280.87061: done getting next task for host managed_node3 30582 1726855280.87063: ^ task is: TASK: Remove test interface if necessary 30582 1726855280.87066: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855280.87069: getting variables 30582 1726855280.87070: in VariableManager get_vars() 30582 1726855280.87084: Calling all_inventory to load vars for managed_node3 30582 1726855280.87090: Calling groups_inventory to load vars for managed_node3 30582 1726855280.87092: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855280.87099: Calling all_plugins_play to load vars for managed_node3 30582 1726855280.87101: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855280.87104: Calling groups_plugins_play to load vars for managed_node3 30582 1726855280.88409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855280.90098: done with get_vars() 30582 1726855280.90129: done getting variables 30582 1726855280.90178: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 14:01:20 -0400 (0:00:00.123) 0:00:17.252 ****** 30582 1726855280.90212: entering _queue_task() for managed_node3/command 30582 1726855280.90568: worker is 1 (out of 1 available) 30582 1726855280.90581: exiting _queue_task() for managed_node3/command 30582 1726855280.90698: done queuing things up, now waiting for results queue to drain 30582 1726855280.90701: waiting for pending results... 
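The "Remove test interface if necessary" task at `tasks/delete_interface.yml:3` uses the `command` action module and references an `interface` play variable. The command line itself has not appeared in the log at this point; a plausible sketch, with the actual command and error handling marked as assumptions:

```yaml
# Hypothetical sketch of tasks/delete_interface.yml:3 -- the real command
# string is not visible in this log excerpt.
- name: Remove test interface if necessary
  command: ip link delete {{ interface }}   # assumption: deletion via iproute2
  ignore_errors: true   # assumption: a missing interface is acceptable during setup
```

The subsequent `_low_level_execute_command()` calls in the log (`echo ~`, then the `umask 77 && mkdir -p ...` temp-directory dance) are the standard preamble Ansible runs over SSH before copying and executing the `command` module payload on the target.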
30582 1726855280.90913: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 30582 1726855280.91058: in run() - task 0affcc66-ac2b-aa83-7d57-00000000063e 30582 1726855280.91062: variable 'ansible_search_path' from source: unknown 30582 1726855280.91065: variable 'ansible_search_path' from source: unknown 30582 1726855280.91100: calling self._execute() 30582 1726855280.91175: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.91179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.91191: variable 'omit' from source: magic vars 30582 1726855280.91493: variable 'ansible_distribution_major_version' from source: facts 30582 1726855280.91502: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855280.91508: variable 'omit' from source: magic vars 30582 1726855280.91544: variable 'omit' from source: magic vars 30582 1726855280.91620: variable 'interface' from source: play vars 30582 1726855280.91635: variable 'omit' from source: magic vars 30582 1726855280.91666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855280.91699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855280.91716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855280.91731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.91741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855280.91764: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855280.91767: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.91769: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.91849: Set connection var ansible_timeout to 10 30582 1726855280.91852: Set connection var ansible_connection to ssh 30582 1726855280.91858: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855280.91862: Set connection var ansible_pipelining to False 30582 1726855280.91867: Set connection var ansible_shell_executable to /bin/sh 30582 1726855280.91870: Set connection var ansible_shell_type to sh 30582 1726855280.91890: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.91894: variable 'ansible_connection' from source: unknown 30582 1726855280.91896: variable 'ansible_module_compression' from source: unknown 30582 1726855280.91898: variable 'ansible_shell_type' from source: unknown 30582 1726855280.91900: variable 'ansible_shell_executable' from source: unknown 30582 1726855280.91902: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855280.91912: variable 'ansible_pipelining' from source: unknown 30582 1726855280.91914: variable 'ansible_timeout' from source: unknown 30582 1726855280.91920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855280.92025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855280.92033: variable 'omit' from source: magic vars 30582 1726855280.92039: starting attempt loop 30582 1726855280.92042: running the handler 30582 1726855280.92056: _low_level_execute_command(): starting 30582 1726855280.92063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855280.92552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.92592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855280.92596: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.92599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.92602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.92648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.92651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855280.92653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.92727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.94411: stdout chunk (state=3): >>>/root <<< 30582 1726855280.94513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.94550: stderr chunk (state=3): >>><<< 30582 1726855280.94552: stdout chunk (state=3): >>><<< 30582 1726855280.94566: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.94592: _low_level_execute_command(): starting 30582 1726855280.94595: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643 `" && echo ansible-tmp-1726855280.945707-31352-27541103934643="` echo /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643 `" ) && sleep 0' 30582 1726855280.95036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.95039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855280.95049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.95051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855280.95053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.95093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.95106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.95168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.97069: stdout chunk (state=3): >>>ansible-tmp-1726855280.945707-31352-27541103934643=/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643 <<< 30582 1726855280.97173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855280.97201: stderr chunk (state=3): >>><<< 30582 1726855280.97204: stdout chunk (state=3): >>><<< 30582 1726855280.97219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855280.945707-31352-27541103934643=/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855280.97247: variable 'ansible_module_compression' from source: unknown 30582 1726855280.97290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855280.97320: variable 'ansible_facts' from source: unknown 30582 1726855280.97375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py 30582 1726855280.97484: Sending initial data 30582 1726855280.97490: Sent initial data (154 bytes) 30582 1726855280.97924: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855280.97927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855280.97929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.97931: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855280.97933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855280.97985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855280.97993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855280.98050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855280.99620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855280.99672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855280.99739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpw0sz2su1 /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py <<< 30582 1726855280.99742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py" <<< 30582 1726855280.99794: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpw0sz2su1" to remote "/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py" <<< 30582 1726855281.00403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.00441: stderr chunk (state=3): >>><<< 30582 1726855281.00444: stdout chunk (state=3): >>><<< 30582 1726855281.00470: done transferring module to remote 30582 1726855281.00477: _low_level_execute_command(): starting 30582 1726855281.00484: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/ /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py && sleep 0' 30582 1726855281.00929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.00932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855281.00934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855281.00936: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.00938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855281.00944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.00984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.00990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.01060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.02800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.02823: stderr chunk (state=3): >>><<< 30582 1726855281.02826: stdout chunk (state=3): >>><<< 30582 1726855281.02840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.02842: _low_level_execute_command(): starting 30582 1726855281.02847: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/AnsiballZ_command.py && sleep 0' 30582 1726855281.03255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.03285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855281.03291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.03293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.03297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855281.03299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30582 1726855281.03345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.03349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.03427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.20076: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 14:01:21.192978", "end": "2024-09-20 14:01:21.199232", "delta": "0:00:00.006254", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855281.21594: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 30582 1726855281.21598: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855281.21602: stderr chunk (state=3): >>><<< 30582 1726855281.21605: stdout chunk (state=3): >>><<< 30582 1726855281.21608: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 14:01:21.192978", "end": "2024-09-20 14:01:21.199232", "delta": "0:00:00.006254", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
30582 1726855281.21613: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855281.21619: _low_level_execute_command(): starting 30582 1726855281.21628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855280.945707-31352-27541103934643/ > /dev/null 2>&1 && sleep 0' 30582 1726855281.22316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.22391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.22421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.22437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.22595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.24431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.24528: stderr chunk (state=3): >>><<< 30582 1726855281.24542: stdout chunk (state=3): >>><<< 30582 1726855281.24897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.24901: handler run complete 30582 
1726855281.24903: Evaluated conditional (False): False 30582 1726855281.24905: attempt loop complete, returning result 30582 1726855281.24907: _execute() done 30582 1726855281.24908: dumping result to json 30582 1726855281.24910: done dumping result, returning 30582 1726855281.24912: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0affcc66-ac2b-aa83-7d57-00000000063e] 30582 1726855281.24914: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000063e 30582 1726855281.24981: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000063e 30582 1726855281.24985: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006254", "end": "2024-09-20 14:01:21.199232", "rc": 1, "start": "2024-09-20 14:01:21.192978" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855281.25066: no more pending results, returning what we have 30582 1726855281.25073: results queue empty 30582 1726855281.25074: checking for any_errors_fatal 30582 1726855281.25075: done checking for any_errors_fatal 30582 1726855281.25075: checking for max_fail_percentage 30582 1726855281.25077: done checking for max_fail_percentage 30582 1726855281.25079: checking to see if all hosts have failed and the running result is not ok 30582 1726855281.25079: done checking to see if all hosts have failed 30582 1726855281.25080: getting the remaining hosts for this loop 30582 1726855281.25081: done getting the remaining hosts for this loop 30582 1726855281.25086: getting the next task for host managed_node3 30582 1726855281.25098: done getting next task for host managed_node3 30582 1726855281.25101: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855281.25110: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855281.25115: getting variables 30582 1726855281.25117: in VariableManager get_vars() 30582 1726855281.25150: Calling all_inventory to load vars for managed_node3 30582 1726855281.25152: Calling groups_inventory to load vars for managed_node3 30582 1726855281.25155: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855281.25168: Calling all_plugins_play to load vars for managed_node3 30582 1726855281.25170: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855281.25173: Calling groups_plugins_play to load vars for managed_node3 30582 1726855281.28166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855281.30615: done with get_vars() 30582 1726855281.30652: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 14:01:21 -0400 (0:00:00.407) 0:00:17.659 ****** 30582 1726855281.30951: entering _queue_task() for managed_node3/include_tasks 30582 1726855281.31574: worker is 1 (out of 1 available) 30582 1726855281.31589: exiting _queue_task() for 
managed_node3/include_tasks 30582 1726855281.31603: done queuing things up, now waiting for results queue to drain 30582 1726855281.31604: waiting for pending results... 30582 1726855281.31977: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855281.32495: in run() - task 0affcc66-ac2b-aa83-7d57-000000000642 30582 1726855281.32499: variable 'ansible_search_path' from source: unknown 30582 1726855281.32502: variable 'ansible_search_path' from source: unknown 30582 1726855281.32505: calling self._execute() 30582 1726855281.32508: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.32510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.32513: variable 'omit' from source: magic vars 30582 1726855281.33493: variable 'ansible_distribution_major_version' from source: facts 30582 1726855281.33497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855281.33501: _execute() done 30582 1726855281.33505: dumping result to json 30582 1726855281.33508: done dumping result, returning 30582 1726855281.33511: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-000000000642] 30582 1726855281.33514: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000642 30582 1726855281.33743: no more pending results, returning what we have 30582 1726855281.33749: in VariableManager get_vars() 30582 1726855281.33792: Calling all_inventory to load vars for managed_node3 30582 1726855281.33796: Calling groups_inventory to load vars for managed_node3 30582 1726855281.33800: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855281.33927: Calling all_plugins_play to load vars for managed_node3 30582 1726855281.33930: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855281.33934: Calling groups_plugins_play to load vars for 
managed_node3 30582 1726855281.34627: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000642 30582 1726855281.34631: WORKER PROCESS EXITING 30582 1726855281.40245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855281.41901: done with get_vars() 30582 1726855281.41950: variable 'ansible_search_path' from source: unknown 30582 1726855281.41952: variable 'ansible_search_path' from source: unknown 30582 1726855281.41962: variable 'item' from source: include params 30582 1726855281.42121: variable 'item' from source: include params 30582 1726855281.42405: we have included files to process 30582 1726855281.42407: generating all_blocks data 30582 1726855281.42409: done generating all_blocks data 30582 1726855281.42411: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855281.42412: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855281.42415: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855281.42602: done processing included file 30582 1726855281.42604: iterating over new_blocks loaded from include file 30582 1726855281.42606: in VariableManager get_vars() 30582 1726855281.42705: done with get_vars() 30582 1726855281.42708: filtering new block on tags 30582 1726855281.42742: done filtering new block on tags 30582 1726855281.42744: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855281.42748: extending task lists for all hosts with included blocks 30582 1726855281.42908: done extending task lists 30582 
1726855281.42910: done processing included files 30582 1726855281.42911: results queue empty 30582 1726855281.42911: checking for any_errors_fatal 30582 1726855281.42915: done checking for any_errors_fatal 30582 1726855281.42916: checking for max_fail_percentage 30582 1726855281.42917: done checking for max_fail_percentage 30582 1726855281.42918: checking to see if all hosts have failed and the running result is not ok 30582 1726855281.42918: done checking to see if all hosts have failed 30582 1726855281.42919: getting the remaining hosts for this loop 30582 1726855281.42920: done getting the remaining hosts for this loop 30582 1726855281.42922: getting the next task for host managed_node3 30582 1726855281.42927: done getting next task for host managed_node3 30582 1726855281.42929: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855281.42932: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30582 1726855281.42934: getting variables 30582 1726855281.42935: in VariableManager get_vars() 30582 1726855281.42950: Calling all_inventory to load vars for managed_node3 30582 1726855281.42952: Calling groups_inventory to load vars for managed_node3 30582 1726855281.42955: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855281.42960: Calling all_plugins_play to load vars for managed_node3 30582 1726855281.42962: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855281.42964: Calling groups_plugins_play to load vars for managed_node3 30582 1726855281.44243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855281.46055: done with get_vars() 30582 1726855281.46078: done getting variables 30582 1726855281.46277: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:01:21 -0400 (0:00:00.153) 0:00:17.813 ****** 30582 1726855281.46334: entering _queue_task() for managed_node3/stat 30582 1726855281.47022: worker is 1 (out of 1 available) 30582 1726855281.47035: exiting _queue_task() for managed_node3/stat 30582 1726855281.47047: done queuing things up, now waiting for results queue to drain 30582 1726855281.47054: waiting for pending results... 
30582 1726855281.47281: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855281.47501: in run() - task 0affcc66-ac2b-aa83-7d57-000000000691 30582 1726855281.47615: variable 'ansible_search_path' from source: unknown 30582 1726855281.47618: variable 'ansible_search_path' from source: unknown 30582 1726855281.47656: calling self._execute() 30582 1726855281.47816: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.47820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.47829: variable 'omit' from source: magic vars 30582 1726855281.48893: variable 'ansible_distribution_major_version' from source: facts 30582 1726855281.48897: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855281.48900: variable 'omit' from source: magic vars 30582 1726855281.48902: variable 'omit' from source: magic vars 30582 1726855281.49075: variable 'interface' from source: play vars 30582 1726855281.49097: variable 'omit' from source: magic vars 30582 1726855281.49200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855281.49258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855281.49280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855281.49304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855281.49318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855281.49352: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855281.49355: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.49358: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.49475: Set connection var ansible_timeout to 10 30582 1726855281.49478: Set connection var ansible_connection to ssh 30582 1726855281.49491: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855281.49498: Set connection var ansible_pipelining to False 30582 1726855281.49504: Set connection var ansible_shell_executable to /bin/sh 30582 1726855281.49507: Set connection var ansible_shell_type to sh 30582 1726855281.49569: variable 'ansible_shell_executable' from source: unknown 30582 1726855281.49573: variable 'ansible_connection' from source: unknown 30582 1726855281.49575: variable 'ansible_module_compression' from source: unknown 30582 1726855281.49577: variable 'ansible_shell_type' from source: unknown 30582 1726855281.49580: variable 'ansible_shell_executable' from source: unknown 30582 1726855281.49581: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.49583: variable 'ansible_pipelining' from source: unknown 30582 1726855281.49585: variable 'ansible_timeout' from source: unknown 30582 1726855281.49591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.49755: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855281.49766: variable 'omit' from source: magic vars 30582 1726855281.49785: starting attempt loop 30582 1726855281.49790: running the handler 30582 1726855281.49799: _low_level_execute_command(): starting 30582 1726855281.49895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855281.50522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855281.50606: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.50637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.50661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.50674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.50773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.52792: stdout chunk (state=3): >>>/root <<< 30582 1726855281.52797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.52799: stdout chunk (state=3): >>><<< 30582 1726855281.52802: stderr chunk (state=3): >>><<< 30582 1726855281.52805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.52809: _low_level_execute_command(): starting 30582 1726855281.52812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619 `" && echo ansible-tmp-1726855281.5269568-31375-151284143946619="` echo /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619 `" ) && sleep 0' 30582 1726855281.53413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855281.53446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.53470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855281.53498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855281.53547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855281.53730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.53761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.53977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.54061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.56053: stdout chunk (state=3): >>>ansible-tmp-1726855281.5269568-31375-151284143946619=/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619 <<< 30582 1726855281.56103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.56166: stderr chunk (state=3): >>><<< 30582 1726855281.56176: stdout chunk (state=3): >>><<< 30582 1726855281.56203: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855281.5269568-31375-151284143946619=/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.56414: variable 'ansible_module_compression' from source: unknown 30582 1726855281.56418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855281.56420: variable 'ansible_facts' from source: unknown 30582 1726855281.56498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py 30582 1726855281.56717: Sending initial data 30582 1726855281.56721: Sent initial data (153 bytes) 30582 1726855281.57393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.57413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.57432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.57524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.59083: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855281.59102: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855281.59125: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855281.59215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855281.59271: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpwbaclyw6 /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py <<< 30582 1726855281.59299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py" <<< 30582 1726855281.59352: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpwbaclyw6" to remote "/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py" <<< 30582 1726855281.60298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.60385: stderr chunk (state=3): >>><<< 30582 1726855281.60390: stdout chunk (state=3): >>><<< 30582 1726855281.60500: done transferring module to remote 30582 1726855281.60504: _low_level_execute_command(): starting 30582 1726855281.60506: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/ /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py && sleep 0' 30582 1726855281.61184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855281.61220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.61397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.61511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.61597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.63373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.63428: stderr chunk (state=3): >>><<< 30582 1726855281.63443: stdout chunk (state=3): >>><<< 30582 1726855281.63540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.63544: _low_level_execute_command(): starting 30582 1726855281.63547: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/AnsiballZ_stat.py && sleep 0' 30582 1726855281.64059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855281.64073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855281.64095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855281.64116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855281.64133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855281.64146: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855281.64161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.64183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855281.64207: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855281.64298: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855281.64328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.64426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.79490: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855281.81182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855281.81450: stdout chunk (state=3): >>><<< 30582 1726855281.81454: stderr chunk (state=3): >>><<< 30582 1726855281.81456: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855281.81460: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855281.81462: _low_level_execute_command(): starting 30582 1726855281.81465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855281.5269568-31375-151284143946619/ > /dev/null 2>&1 && sleep 0' 30582 1726855281.82075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855281.82109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855281.82203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855281.82224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855281.82251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855281.82438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855281.84212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855281.84323: stderr chunk (state=3): >>><<< 30582 1726855281.84591: stdout chunk (state=3): >>><<< 30582 1726855281.84595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855281.84598: handler run complete 30582 1726855281.84600: attempt loop complete, returning result 30582 1726855281.84602: _execute() done 30582 1726855281.84604: dumping result to json 30582 1726855281.84606: done dumping result, returning 30582 1726855281.84608: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000000691] 30582 1726855281.84610: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000691 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855281.84786: no more pending results, returning what we have 30582 1726855281.84792: results queue empty 30582 1726855281.84793: checking for any_errors_fatal 30582 1726855281.84794: done checking for any_errors_fatal 30582 1726855281.84795: checking for max_fail_percentage 30582 1726855281.84797: done checking for max_fail_percentage 30582 1726855281.84798: checking to see if all hosts have failed and the running result is not ok 30582 1726855281.84799: done checking to see if all hosts have failed 30582 1726855281.84799: getting the remaining hosts for this loop 30582 1726855281.84801: done getting the remaining hosts for this loop 30582 1726855281.84804: getting the next task for host managed_node3 30582 1726855281.84814: done getting next task for host managed_node3 30582 1726855281.84817: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30582 1726855281.84821: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
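The trace above shows Ansible's full remote-execution lifecycle for the `stat` task: create a remote temp directory, transfer the AnsiballZ wrapper over sftp, `chmod u+x` it, run it with the remote `/usr/bin/python3.12`, and finally `rm -f -r` the temp directory. From the module arguments logged in the result (`path: /sys/class/net/statebr`, with attribute, checksum, and mime collection disabled), the task in `get_interface_stat.yml` presumably resembles the sketch below; the `register` name `interface_stat` is an inference from the assert task that consumes it later, not something stated in this part of the log.

```yaml
# Hedged reconstruction of tasks/get_interface_stat.yml.
# Module args are taken from the logged invocation; the register
# name "interface_stat" is inferred from the subsequent assert task.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

With `interface: statebr` from play vars, this stats `/sys/class/net/statebr`, which exists only when a kernel network interface of that name is present; hence `"stat": {"exists": false}` in the result above means the interface is absent.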
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855281.84825: getting variables 30582 1726855281.84827: in VariableManager get_vars() 30582 1726855281.84858: Calling all_inventory to load vars for managed_node3 30582 1726855281.84860: Calling groups_inventory to load vars for managed_node3 30582 1726855281.84863: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855281.84874: Calling all_plugins_play to load vars for managed_node3 30582 1726855281.84876: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855281.84879: Calling groups_plugins_play to load vars for managed_node3 30582 1726855281.86196: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000691 30582 1726855281.86200: WORKER PROCESS EXITING 30582 1726855281.88173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855281.91828: done with get_vars() 30582 1726855281.91859: done getting variables 30582 1726855281.91932: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855281.92238: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 14:01:21 -0400 (0:00:00.459) 0:00:18.272 ****** 30582 1726855281.92273: entering _queue_task() for managed_node3/assert 30582 1726855281.93144: worker is 1 (out of 1 available) 30582 1726855281.93158: exiting _queue_task() for managed_node3/assert 30582 1726855281.93170: done queuing things up, now waiting for results queue to drain 30582 1726855281.93172: waiting for pending results... 30582 1726855281.93673: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30582 1726855281.94033: in run() - task 0affcc66-ac2b-aa83-7d57-000000000643 30582 1726855281.94055: variable 'ansible_search_path' from source: unknown 30582 1726855281.94065: variable 'ansible_search_path' from source: unknown 30582 1726855281.94158: calling self._execute() 30582 1726855281.94374: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.94394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.94410: variable 'omit' from source: magic vars 30582 1726855281.95260: variable 'ansible_distribution_major_version' from source: facts 30582 1726855281.95279: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855281.95298: variable 'omit' from source: magic vars 30582 1726855281.95348: variable 'omit' from source: magic vars 30582 1726855281.95594: variable 'interface' from source: play vars 30582 1726855281.95619: variable 'omit' from source: magic vars 30582 1726855281.95665: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855281.95894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855281.95898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855281.95900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855281.95903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855281.96021: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855281.96031: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.96040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.96248: Set connection var ansible_timeout to 10 30582 1726855281.96257: Set connection var ansible_connection to ssh 30582 1726855281.96272: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855281.96336: Set connection var ansible_pipelining to False 30582 1726855281.96347: Set connection var ansible_shell_executable to /bin/sh 30582 1726855281.96355: Set connection var ansible_shell_type to sh 30582 1726855281.96390: variable 'ansible_shell_executable' from source: unknown 30582 1726855281.96442: variable 'ansible_connection' from source: unknown 30582 1726855281.96450: variable 'ansible_module_compression' from source: unknown 30582 1726855281.96457: variable 'ansible_shell_type' from source: unknown 30582 1726855281.96464: variable 'ansible_shell_executable' from source: unknown 30582 1726855281.96470: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855281.96478: variable 'ansible_pipelining' from source: unknown 30582 1726855281.96491: variable 'ansible_timeout' 
from source: unknown 30582 1726855281.96501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855281.96759: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855281.97094: variable 'omit' from source: magic vars 30582 1726855281.97097: starting attempt loop 30582 1726855281.97100: running the handler 30582 1726855281.97170: variable 'interface_stat' from source: set_fact 30582 1726855281.97493: Evaluated conditional (not interface_stat.stat.exists): True 30582 1726855281.97497: handler run complete 30582 1726855281.97499: attempt loop complete, returning result 30582 1726855281.97501: _execute() done 30582 1726855281.97504: dumping result to json 30582 1726855281.97506: done dumping result, returning 30582 1726855281.97508: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000643] 30582 1726855281.97510: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000643 30582 1726855281.97576: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000643 30582 1726855281.97582: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855281.97647: no more pending results, returning what we have 30582 1726855281.97651: results queue empty 30582 1726855281.97653: checking for any_errors_fatal 30582 1726855281.97660: done checking for any_errors_fatal 30582 1726855281.97661: checking for max_fail_percentage 30582 1726855281.97664: done checking for max_fail_percentage 30582 1726855281.97665: checking to see if all hosts have failed and the running result is not ok 30582 1726855281.97665: done checking to see if all hosts have failed 30582 
1726855281.97666: getting the remaining hosts for this loop 30582 1726855281.97668: done getting the remaining hosts for this loop 30582 1726855281.97672: getting the next task for host managed_node3 30582 1726855281.97685: done getting next task for host managed_node3 30582 1726855281.97691: ^ task is: TASK: Test 30582 1726855281.97694: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855281.97700: getting variables 30582 1726855281.97702: in VariableManager get_vars() 30582 1726855281.97736: Calling all_inventory to load vars for managed_node3 30582 1726855281.97739: Calling groups_inventory to load vars for managed_node3 30582 1726855281.97743: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855281.97755: Calling all_plugins_play to load vars for managed_node3 30582 1726855281.97759: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855281.97763: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.01005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.04218: done with get_vars() 30582 1726855282.04247: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:01:22 -0400 (0:00:00.122) 
0:00:18.395 ****** 30582 1726855282.04560: entering _queue_task() for managed_node3/include_tasks 30582 1726855282.05353: worker is 1 (out of 1 available) 30582 1726855282.05368: exiting _queue_task() for managed_node3/include_tasks 30582 1726855282.05384: done queuing things up, now waiting for results queue to drain 30582 1726855282.05386: waiting for pending results... 30582 1726855282.05986: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855282.06099: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b8 30582 1726855282.06211: variable 'ansible_search_path' from source: unknown 30582 1726855282.06219: variable 'ansible_search_path' from source: unknown 30582 1726855282.06512: variable 'lsr_test' from source: include params 30582 1726855282.06716: variable 'lsr_test' from source: include params 30582 1726855282.06865: variable 'omit' from source: magic vars 30582 1726855282.07194: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.07210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.07225: variable 'omit' from source: magic vars 30582 1726855282.07817: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.07835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.07847: variable 'item' from source: unknown 30582 1726855282.08004: variable 'item' from source: unknown 30582 1726855282.08048: variable 'item' from source: unknown 30582 1726855282.08357: variable 'item' from source: unknown 30582 1726855282.08583: dumping result to json 30582 1726855282.08588: done dumping result, returning 30582 1726855282.08591: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-0000000005b8] 30582 1726855282.08594: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b8 30582 1726855282.08636: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b8 30582 
1726855282.08639: WORKER PROCESS EXITING 30582 1726855282.08661: no more pending results, returning what we have 30582 1726855282.08666: in VariableManager get_vars() 30582 1726855282.08711: Calling all_inventory to load vars for managed_node3 30582 1726855282.08714: Calling groups_inventory to load vars for managed_node3 30582 1726855282.08718: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.08733: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.08736: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.08739: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.11809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.15042: done with get_vars() 30582 1726855282.15068: variable 'ansible_search_path' from source: unknown 30582 1726855282.15070: variable 'ansible_search_path' from source: unknown 30582 1726855282.15110: we have included files to process 30582 1726855282.15112: generating all_blocks data 30582 1726855282.15113: done generating all_blocks data 30582 1726855282.15118: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30582 1726855282.15119: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30582 1726855282.15121: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30582 1726855282.15855: done processing included file 30582 1726855282.15858: iterating over new_blocks loaded from include file 30582 1726855282.15860: in VariableManager get_vars() 30582 1726855282.15881: done with get_vars() 30582 1726855282.15883: filtering new block on tags 30582 
1726855282.15924: done filtering new block on tags 30582 1726855282.15927: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node3 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 30582 1726855282.15933: extending task lists for all hosts with included blocks 30582 1726855282.17847: done extending task lists 30582 1726855282.17849: done processing included files 30582 1726855282.17849: results queue empty 30582 1726855282.17850: checking for any_errors_fatal 30582 1726855282.17854: done checking for any_errors_fatal 30582 1726855282.17855: checking for max_fail_percentage 30582 1726855282.17856: done checking for max_fail_percentage 30582 1726855282.17857: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.17857: done checking to see if all hosts have failed 30582 1726855282.17858: getting the remaining hosts for this loop 30582 1726855282.17859: done getting the remaining hosts for this loop 30582 1726855282.17862: getting the next task for host managed_node3 30582 1726855282.17866: done getting next task for host managed_node3 30582 1726855282.17868: ^ task is: TASK: Include network role 30582 1726855282.17871: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855282.17873: getting variables 30582 1726855282.17874: in VariableManager get_vars() 30582 1726855282.17890: Calling all_inventory to load vars for managed_node3 30582 1726855282.17892: Calling groups_inventory to load vars for managed_node3 30582 1726855282.17895: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.17901: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.17903: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.17905: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.20626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.23837: done with get_vars() 30582 1726855282.23869: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 14:01:22 -0400 (0:00:00.193) 0:00:18.589 ****** 30582 1726855282.23952: entering _queue_task() for managed_node3/include_role 30582 1726855282.24725: worker is 1 (out of 1 available) 30582 1726855282.24736: exiting _queue_task() for managed_node3/include_role 30582 1726855282.24749: done queuing things up, now waiting for results queue to drain 30582 1726855282.24750: waiting for pending results... 
30582 1726855282.25606: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855282.25794: in run() - task 0affcc66-ac2b-aa83-7d57-0000000006b1 30582 1726855282.25799: variable 'ansible_search_path' from source: unknown 30582 1726855282.25802: variable 'ansible_search_path' from source: unknown 30582 1726855282.25804: calling self._execute() 30582 1726855282.25806: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.25809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.25811: variable 'omit' from source: magic vars 30582 1726855282.26406: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.26424: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.26435: _execute() done 30582 1726855282.26444: dumping result to json 30582 1726855282.26451: done dumping result, returning 30582 1726855282.26463: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-0000000006b1] 30582 1726855282.26473: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000006b1 30582 1726855282.26645: no more pending results, returning what we have 30582 1726855282.26651: in VariableManager get_vars() 30582 1726855282.26695: Calling all_inventory to load vars for managed_node3 30582 1726855282.26698: Calling groups_inventory to load vars for managed_node3 30582 1726855282.26701: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.26709: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000006b1 30582 1726855282.26711: WORKER PROCESS EXITING 30582 1726855282.26803: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.26806: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.26810: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.29481: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.31628: done with get_vars() 30582 1726855282.31648: variable 'ansible_search_path' from source: unknown 30582 1726855282.31649: variable 'ansible_search_path' from source: unknown 30582 1726855282.31848: variable 'omit' from source: magic vars 30582 1726855282.31893: variable 'omit' from source: magic vars 30582 1726855282.31912: variable 'omit' from source: magic vars 30582 1726855282.31916: we have included files to process 30582 1726855282.31917: generating all_blocks data 30582 1726855282.31919: done generating all_blocks data 30582 1726855282.31920: processing included file: fedora.linux_system_roles.network 30582 1726855282.31940: in VariableManager get_vars() 30582 1726855282.31954: done with get_vars() 30582 1726855282.31979: in VariableManager get_vars() 30582 1726855282.31995: done with get_vars() 30582 1726855282.32038: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855282.32160: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855282.32242: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855282.33060: in VariableManager get_vars() 30582 1726855282.33195: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855282.35977: iterating over new_blocks loaded from include file 30582 1726855282.35980: in VariableManager get_vars() 30582 1726855282.36003: done with get_vars() 30582 1726855282.36005: filtering new block on tags 30582 1726855282.36318: done filtering new block on tags 30582 1726855282.36323: in VariableManager get_vars() 30582 1726855282.36339: done with get_vars() 30582 1726855282.36340: filtering new block on tags 30582 1726855282.36357: done 
filtering new block on tags 30582 1726855282.36358: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855282.36364: extending task lists for all hosts with included blocks 30582 1726855282.36541: done extending task lists 30582 1726855282.36543: done processing included files 30582 1726855282.36543: results queue empty 30582 1726855282.36544: checking for any_errors_fatal 30582 1726855282.36549: done checking for any_errors_fatal 30582 1726855282.36550: checking for max_fail_percentage 30582 1726855282.36551: done checking for max_fail_percentage 30582 1726855282.36552: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.36553: done checking to see if all hosts have failed 30582 1726855282.36553: getting the remaining hosts for this loop 30582 1726855282.36554: done getting the remaining hosts for this loop 30582 1726855282.36557: getting the next task for host managed_node3 30582 1726855282.36561: done getting next task for host managed_node3 30582 1726855282.36564: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855282.36567: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855282.36576: getting variables 30582 1726855282.36577: in VariableManager get_vars() 30582 1726855282.36592: Calling all_inventory to load vars for managed_node3 30582 1726855282.36594: Calling groups_inventory to load vars for managed_node3 30582 1726855282.36596: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.36602: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.36605: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.36608: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.37840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.39444: done with get_vars() 30582 1726855282.39472: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:01:22 -0400 (0:00:00.156) 0:00:18.745 ****** 30582 1726855282.39562: entering _queue_task() for managed_node3/include_tasks 30582 1726855282.40111: worker is 1 (out of 1 available) 30582 1726855282.40122: exiting _queue_task() for managed_node3/include_tasks 30582 1726855282.40132: done queuing things up, now waiting for results queue to drain 30582 1726855282.40133: waiting for pending results... 
30582 1726855282.40259: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855282.40403: in run() - task 0affcc66-ac2b-aa83-7d57-00000000072f 30582 1726855282.40422: variable 'ansible_search_path' from source: unknown 30582 1726855282.40430: variable 'ansible_search_path' from source: unknown 30582 1726855282.40475: calling self._execute() 30582 1726855282.40566: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.40585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.40603: variable 'omit' from source: magic vars 30582 1726855282.40976: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.40997: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.41015: _execute() done 30582 1726855282.41024: dumping result to json 30582 1726855282.41033: done dumping result, returning 30582 1726855282.41123: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-00000000072f] 30582 1726855282.41127: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000072f 30582 1726855282.41199: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000072f 30582 1726855282.41202: WORKER PROCESS EXITING 30582 1726855282.41268: no more pending results, returning what we have 30582 1726855282.41275: in VariableManager get_vars() 30582 1726855282.41320: Calling all_inventory to load vars for managed_node3 30582 1726855282.41324: Calling groups_inventory to load vars for managed_node3 30582 1726855282.41327: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.41341: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.41345: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.41348: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855282.43061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.44657: done with get_vars() 30582 1726855282.44690: variable 'ansible_search_path' from source: unknown 30582 1726855282.44691: variable 'ansible_search_path' from source: unknown 30582 1726855282.44734: we have included files to process 30582 1726855282.44735: generating all_blocks data 30582 1726855282.44737: done generating all_blocks data 30582 1726855282.44740: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855282.44741: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855282.44744: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855282.45363: done processing included file 30582 1726855282.45365: iterating over new_blocks loaded from include file 30582 1726855282.45367: in VariableManager get_vars() 30582 1726855282.45394: done with get_vars() 30582 1726855282.45396: filtering new block on tags 30582 1726855282.45430: done filtering new block on tags 30582 1726855282.45433: in VariableManager get_vars() 30582 1726855282.45461: done with get_vars() 30582 1726855282.45463: filtering new block on tags 30582 1726855282.45512: done filtering new block on tags 30582 1726855282.45515: in VariableManager get_vars() 30582 1726855282.45537: done with get_vars() 30582 1726855282.45539: filtering new block on tags 30582 1726855282.45586: done filtering new block on tags 30582 1726855282.45590: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855282.45595: extending task lists for 
all hosts with included blocks 30582 1726855282.47327: done extending task lists 30582 1726855282.47329: done processing included files 30582 1726855282.47330: results queue empty 30582 1726855282.47331: checking for any_errors_fatal 30582 1726855282.47334: done checking for any_errors_fatal 30582 1726855282.47334: checking for max_fail_percentage 30582 1726855282.47336: done checking for max_fail_percentage 30582 1726855282.47336: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.47337: done checking to see if all hosts have failed 30582 1726855282.47338: getting the remaining hosts for this loop 30582 1726855282.47339: done getting the remaining hosts for this loop 30582 1726855282.47342: getting the next task for host managed_node3 30582 1726855282.47348: done getting next task for host managed_node3 30582 1726855282.47351: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855282.47354: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855282.47366: getting variables 30582 1726855282.47367: in VariableManager get_vars() 30582 1726855282.47384: Calling all_inventory to load vars for managed_node3 30582 1726855282.47390: Calling groups_inventory to load vars for managed_node3 30582 1726855282.47393: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.47403: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.47406: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.47409: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.48695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.50274: done with get_vars() 30582 1726855282.50304: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:01:22 -0400 (0:00:00.108) 0:00:18.853 ****** 30582 1726855282.50391: entering _queue_task() for managed_node3/setup 30582 1726855282.50752: worker is 1 (out of 1 available) 30582 1726855282.50765: exiting _queue_task() for managed_node3/setup 30582 1726855282.50776: done queuing things up, now waiting for results queue to drain 30582 1726855282.50778: waiting for pending results... 
30582 1726855282.51118: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855282.51323: in run() - task 0affcc66-ac2b-aa83-7d57-00000000078c 30582 1726855282.51327: variable 'ansible_search_path' from source: unknown 30582 1726855282.51330: variable 'ansible_search_path' from source: unknown 30582 1726855282.51333: calling self._execute() 30582 1726855282.51421: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.51798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.51802: variable 'omit' from source: magic vars 30582 1726855282.52358: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.52376: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.52760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855282.55094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855282.55159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855282.55207: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855282.55246: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855282.55272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855282.55348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855282.55376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855282.55401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855282.55438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855282.55451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855282.55510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855282.55534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855282.55559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855282.55613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855282.55623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855282.55785: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855282.55814: variable 'ansible_facts' from source: unknown 30582 1726855282.56641: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855282.56645: when evaluation is False, skipping this task 30582 1726855282.56648: _execute() done 30582 1726855282.56651: dumping result to json 30582 1726855282.56653: done dumping result, returning 30582 1726855282.56659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-00000000078c] 30582 1726855282.56663: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078c 30582 1726855282.56756: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078c 30582 1726855282.56760: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855282.56815: no more pending results, returning what we have 30582 1726855282.56819: results queue empty 30582 1726855282.56820: checking for any_errors_fatal 30582 1726855282.56821: done checking for any_errors_fatal 30582 1726855282.56822: checking for max_fail_percentage 30582 1726855282.56824: done checking for max_fail_percentage 30582 1726855282.56825: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.56826: done checking to see if all hosts have failed 30582 1726855282.56827: getting the remaining hosts for this loop 30582 1726855282.56990: done getting the remaining hosts for this loop 30582 1726855282.56994: getting the next task for host managed_node3 30582 1726855282.57005: done getting next task for host managed_node3 30582 1726855282.57008: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855282.57013: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855282.57029: getting variables 30582 1726855282.57030: in VariableManager get_vars() 30582 1726855282.57062: Calling all_inventory to load vars for managed_node3 30582 1726855282.57065: Calling groups_inventory to load vars for managed_node3 30582 1726855282.57068: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.57077: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.57080: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.57092: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.57863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.59586: done with get_vars() 30582 1726855282.59625: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:01:22 -0400 (0:00:00.093) 0:00:18.947 ****** 30582 1726855282.59708: entering _queue_task() for managed_node3/stat 30582 1726855282.60030: worker is 1 (out of 1 available) 30582 1726855282.60046: exiting _queue_task() for managed_node3/stat 30582 1726855282.60058: done queuing things up, now waiting for results queue to drain 30582 1726855282.60060: waiting for pending results... 
30582 1726855282.60425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855282.60471: in run() - task 0affcc66-ac2b-aa83-7d57-00000000078e 30582 1726855282.60496: variable 'ansible_search_path' from source: unknown 30582 1726855282.60505: variable 'ansible_search_path' from source: unknown 30582 1726855282.60552: calling self._execute() 30582 1726855282.60647: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.60658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.60673: variable 'omit' from source: magic vars 30582 1726855282.61115: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.61128: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.61265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855282.61467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855282.61503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855282.61529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855282.61555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855282.61624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855282.61642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855282.61660: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855282.61679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855282.61742: variable '__network_is_ostree' from source: set_fact 30582 1726855282.61748: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855282.61751: when evaluation is False, skipping this task 30582 1726855282.61753: _execute() done 30582 1726855282.61756: dumping result to json 30582 1726855282.61761: done dumping result, returning 30582 1726855282.61768: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-00000000078e] 30582 1726855282.61772: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078e 30582 1726855282.61856: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078e 30582 1726855282.61859: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855282.61910: no more pending results, returning what we have 30582 1726855282.61914: results queue empty 30582 1726855282.61915: checking for any_errors_fatal 30582 1726855282.61922: done checking for any_errors_fatal 30582 1726855282.61922: checking for max_fail_percentage 30582 1726855282.61924: done checking for max_fail_percentage 30582 1726855282.61925: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.61926: done checking to see if all hosts have failed 30582 1726855282.61927: getting the remaining hosts for this loop 30582 1726855282.61928: done getting the remaining hosts for this loop 30582 
1726855282.61932: getting the next task for host managed_node3 30582 1726855282.61941: done getting next task for host managed_node3 30582 1726855282.61944: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855282.61949: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855282.61964: getting variables 30582 1726855282.61966: in VariableManager get_vars() 30582 1726855282.62002: Calling all_inventory to load vars for managed_node3 30582 1726855282.62005: Calling groups_inventory to load vars for managed_node3 30582 1726855282.62007: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.62017: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.62019: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.62022: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.62943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.64282: done with get_vars() 30582 1726855282.64306: done getting variables 30582 1726855282.64350: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:01:22 -0400 (0:00:00.046) 0:00:18.993 ****** 30582 1726855282.64377: entering _queue_task() for managed_node3/set_fact 30582 1726855282.64634: worker is 1 (out of 1 available) 30582 1726855282.64649: exiting _queue_task() for managed_node3/set_fact 30582 1726855282.64661: done queuing things up, now waiting for results queue to drain 30582 1726855282.64662: waiting for pending results... 
30582 1726855282.65008: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855282.65017: in run() - task 0affcc66-ac2b-aa83-7d57-00000000078f 30582 1726855282.65023: variable 'ansible_search_path' from source: unknown 30582 1726855282.65027: variable 'ansible_search_path' from source: unknown 30582 1726855282.65065: calling self._execute() 30582 1726855282.65159: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.65162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.65172: variable 'omit' from source: magic vars 30582 1726855282.65532: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.65547: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.65709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855282.65972: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855282.66021: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855282.66052: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855282.66292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855282.66296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855282.66299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855282.66301: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855282.66304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855282.66324: variable '__network_is_ostree' from source: set_fact 30582 1726855282.66334: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855282.66337: when evaluation is False, skipping this task 30582 1726855282.66340: _execute() done 30582 1726855282.66342: dumping result to json 30582 1726855282.66347: done dumping result, returning 30582 1726855282.66356: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-00000000078f] 30582 1726855282.66360: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078f skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855282.66496: no more pending results, returning what we have 30582 1726855282.66501: results queue empty 30582 1726855282.66502: checking for any_errors_fatal 30582 1726855282.66512: done checking for any_errors_fatal 30582 1726855282.66513: checking for max_fail_percentage 30582 1726855282.66515: done checking for max_fail_percentage 30582 1726855282.66516: checking to see if all hosts have failed and the running result is not ok 30582 1726855282.66517: done checking to see if all hosts have failed 30582 1726855282.66518: getting the remaining hosts for this loop 30582 1726855282.66519: done getting the remaining hosts for this loop 30582 1726855282.66523: getting the next task for host managed_node3 30582 1726855282.66536: done getting next task for host managed_node3 30582 
1726855282.66539: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855282.66544: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855282.66560: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000078f 30582 1726855282.66563: WORKER PROCESS EXITING 30582 1726855282.66624: getting variables 30582 1726855282.66626: in VariableManager get_vars() 30582 1726855282.66662: Calling all_inventory to load vars for managed_node3 30582 1726855282.66665: Calling groups_inventory to load vars for managed_node3 30582 1726855282.66668: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855282.66681: Calling all_plugins_play to load vars for managed_node3 30582 1726855282.66684: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855282.66686: Calling groups_plugins_play to load vars for managed_node3 30582 1726855282.67609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855282.68481: done with get_vars() 30582 1726855282.68502: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:01:22 -0400 (0:00:00.041) 0:00:19.035 ****** 30582 1726855282.68574: entering _queue_task() for managed_node3/service_facts 30582 1726855282.68834: worker is 1 (out of 1 available) 30582 1726855282.68849: exiting _queue_task() for managed_node3/service_facts 30582 1726855282.68863: done queuing things up, now waiting for results queue to drain 30582 1726855282.68865: waiting for pending results... 
30582 1726855282.69050: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855282.69149: in run() - task 0affcc66-ac2b-aa83-7d57-000000000791 30582 1726855282.69160: variable 'ansible_search_path' from source: unknown 30582 1726855282.69164: variable 'ansible_search_path' from source: unknown 30582 1726855282.69198: calling self._execute() 30582 1726855282.69265: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.69269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.69278: variable 'omit' from source: magic vars 30582 1726855282.69551: variable 'ansible_distribution_major_version' from source: facts 30582 1726855282.69560: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855282.69566: variable 'omit' from source: magic vars 30582 1726855282.69618: variable 'omit' from source: magic vars 30582 1726855282.69643: variable 'omit' from source: magic vars 30582 1726855282.69673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855282.69705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855282.69721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855282.69734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855282.69745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855282.69769: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855282.69772: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.69775: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855282.69849: Set connection var ansible_timeout to 10 30582 1726855282.69852: Set connection var ansible_connection to ssh 30582 1726855282.69862: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855282.69864: Set connection var ansible_pipelining to False 30582 1726855282.69868: Set connection var ansible_shell_executable to /bin/sh 30582 1726855282.69871: Set connection var ansible_shell_type to sh 30582 1726855282.69891: variable 'ansible_shell_executable' from source: unknown 30582 1726855282.69893: variable 'ansible_connection' from source: unknown 30582 1726855282.69896: variable 'ansible_module_compression' from source: unknown 30582 1726855282.69899: variable 'ansible_shell_type' from source: unknown 30582 1726855282.69901: variable 'ansible_shell_executable' from source: unknown 30582 1726855282.69903: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855282.69905: variable 'ansible_pipelining' from source: unknown 30582 1726855282.69909: variable 'ansible_timeout' from source: unknown 30582 1726855282.69913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855282.70057: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855282.70064: variable 'omit' from source: magic vars 30582 1726855282.70070: starting attempt loop 30582 1726855282.70074: running the handler 30582 1726855282.70091: _low_level_execute_command(): starting 30582 1726855282.70098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855282.70581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855282.70618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855282.70621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855282.70624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855282.70627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.70681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855282.70684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855282.70688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855282.70762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855282.72464: stdout chunk (state=3): >>>/root <<< 30582 1726855282.72553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855282.72588: stderr chunk (state=3): >>><<< 30582 1726855282.72592: stdout chunk (state=3): >>><<< 30582 1726855282.72611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855282.72623: _low_level_execute_command(): starting 30582 1726855282.72629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824 `" && echo ansible-tmp-1726855282.726118-31442-116644010315824="` echo /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824 `" ) && sleep 0' 30582 1726855282.73052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855282.73061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855282.73085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.73099: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855282.73102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.73151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855282.73154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855282.73160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855282.73220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855282.75110: stdout chunk (state=3): >>>ansible-tmp-1726855282.726118-31442-116644010315824=/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824 <<< 30582 1726855282.75217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855282.75243: stderr chunk (state=3): >>><<< 30582 1726855282.75246: stdout chunk (state=3): >>><<< 30582 1726855282.75263: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855282.726118-31442-116644010315824=/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855282.75307: variable 'ansible_module_compression' from source: unknown 30582 1726855282.75341: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855282.75373: variable 'ansible_facts' from source: unknown 30582 1726855282.75433: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py 30582 1726855282.75534: Sending initial data 30582 1726855282.75538: Sent initial data (161 bytes) 30582 1726855282.75968: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855282.75986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855282.75993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855282.75995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30582 1726855282.76008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.76063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855282.76069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855282.76071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855282.76126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855282.77684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30582 1726855282.77692: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855282.77739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855282.77800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsa1s6hf7 /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py <<< 30582 1726855282.77804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py" <<< 30582 1726855282.77857: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsa1s6hf7" to remote "/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py" <<< 30582 1726855282.77860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py" <<< 30582 1726855282.78454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855282.78493: stderr chunk (state=3): >>><<< 30582 1726855282.78496: stdout chunk (state=3): >>><<< 30582 1726855282.78560: done transferring module to remote 30582 1726855282.78568: _low_level_execute_command(): starting 30582 1726855282.78573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/ /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py && sleep 0' 30582 1726855282.78979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855282.79012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855282.79015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.79018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855282.79024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855282.79072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855282.79075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855282.79134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855282.80897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855282.80924: stderr chunk (state=3): >>><<< 30582 1726855282.80928: stdout chunk (state=3): >>><<< 30582 1726855282.80942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855282.80945: _low_level_execute_command(): starting 30582 1726855282.80949: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/AnsiballZ_service_facts.py && sleep 0' 30582 1726855282.81578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855282.81628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 
1726855282.81706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.33468: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30582 1726855284.33717: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 30582 1726855284.34917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.34930: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855284.35214: stderr chunk (state=3): >>><<< 30582 1726855284.35218: stdout chunk (state=3): >>><<< 30582 1726855284.35396: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855284.37256: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855284.37275: _low_level_execute_command(): starting 30582 1726855284.37286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855282.726118-31442-116644010315824/ > /dev/null 2>&1 && sleep 0' 30582 1726855284.38711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.38810: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855284.38883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.38910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855284.39072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.40944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.41110: stderr chunk (state=3): >>><<< 30582 1726855284.41120: stdout chunk (state=3): >>><<< 30582 1726855284.41197: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855284.41210: handler run complete 30582 1726855284.41614: variable 'ansible_facts' from source: unknown 30582 1726855284.41867: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855284.42896: variable 'ansible_facts' from source: unknown 30582 1726855284.43153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855284.43607: attempt loop complete, returning result 30582 1726855284.43619: _execute() done 30582 1726855284.43627: dumping result to json 30582 1726855284.43693: done dumping result, returning 30582 1726855284.43892: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000000791] 30582 1726855284.43896: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000791 30582 1726855284.46118: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000791 30582 1726855284.46123: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855284.46191: no more pending results, returning what we have 30582 1726855284.46194: results queue empty 30582 1726855284.46195: checking for any_errors_fatal 30582 1726855284.46198: done checking for any_errors_fatal 30582 1726855284.46198: checking for max_fail_percentage 30582 1726855284.46200: done checking for max_fail_percentage 30582 1726855284.46201: checking to see if all hosts have failed and the running result is not ok 30582 1726855284.46202: done checking to see if all hosts have failed 30582 1726855284.46203: getting the remaining hosts for this loop 30582 1726855284.46205: done getting the remaining hosts for this loop 30582 1726855284.46208: getting the next task for host managed_node3 30582 1726855284.46214: done getting next task for host managed_node3 30582 1726855284.46218: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 
1726855284.46224: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855284.46233: getting variables 30582 1726855284.46235: in VariableManager get_vars() 30582 1726855284.46259: Calling all_inventory to load vars for managed_node3 30582 1726855284.46262: Calling groups_inventory to load vars for managed_node3 30582 1726855284.46265: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855284.46274: Calling all_plugins_play to load vars for managed_node3 30582 1726855284.46279: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855284.46282: Calling groups_plugins_play to load vars for managed_node3 30582 1726855284.50584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855284.55304: done with get_vars() 30582 1726855284.55329: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:01:24 -0400 (0:00:01.871) 0:00:20.907 ****** 30582 1726855284.55716: entering _queue_task() for managed_node3/package_facts 30582 1726855284.56566: worker is 1 (out of 1 available) 30582 1726855284.56581: exiting _queue_task() for managed_node3/package_facts 30582 1726855284.56596: done queuing things up, now waiting for results queue to drain 30582 1726855284.56597: waiting for pending results... 
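The `censored` task result logged above comes from `no_log: true` being set on the service_facts task, which is also visible in its module args as `'_ansible_no_log': True`. As a hedged aside (the role's task file itself is not shown in this log), a minimal task shape that produces exactly this censored result looks like:

```yaml
# Sketch only: a fact-gathering task with no_log enabled, matching the
# "'_ansible_no_log': True" module arg and the censored "ok:" result above.
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true   # replaces the registered output with a "censored" notice
```

With `no_log: true`, Ansible still runs the module and records `changed: false`, but substitutes the hidden-output notice for the actual facts in callback output.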
30582 1726855284.57038: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855284.57327: in run() - task 0affcc66-ac2b-aa83-7d57-000000000792 30582 1726855284.57492: variable 'ansible_search_path' from source: unknown 30582 1726855284.57497: variable 'ansible_search_path' from source: unknown 30582 1726855284.57533: calling self._execute() 30582 1726855284.57740: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855284.57744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855284.57757: variable 'omit' from source: magic vars 30582 1726855284.58693: variable 'ansible_distribution_major_version' from source: facts 30582 1726855284.58697: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855284.58699: variable 'omit' from source: magic vars 30582 1726855284.58829: variable 'omit' from source: magic vars 30582 1726855284.58863: variable 'omit' from source: magic vars 30582 1726855284.59120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855284.59124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855284.59126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855284.59180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855284.59183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855284.59323: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855284.59327: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855284.59330: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855284.59559: Set connection var ansible_timeout to 10 30582 1726855284.59563: Set connection var ansible_connection to ssh 30582 1726855284.59569: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855284.59574: Set connection var ansible_pipelining to False 30582 1726855284.59583: Set connection var ansible_shell_executable to /bin/sh 30582 1726855284.59586: Set connection var ansible_shell_type to sh 30582 1726855284.59612: variable 'ansible_shell_executable' from source: unknown 30582 1726855284.59615: variable 'ansible_connection' from source: unknown 30582 1726855284.59618: variable 'ansible_module_compression' from source: unknown 30582 1726855284.59620: variable 'ansible_shell_type' from source: unknown 30582 1726855284.59700: variable 'ansible_shell_executable' from source: unknown 30582 1726855284.59703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855284.59705: variable 'ansible_pipelining' from source: unknown 30582 1726855284.59708: variable 'ansible_timeout' from source: unknown 30582 1726855284.59710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855284.60205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855284.60215: variable 'omit' from source: magic vars 30582 1726855284.60225: starting attempt loop 30582 1726855284.60229: running the handler 30582 1726855284.60245: _low_level_execute_command(): starting 30582 1726855284.60293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855284.62309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.62521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.63026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.64576: stdout chunk (state=3): >>>/root <<< 30582 1726855284.64768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.64771: stderr chunk (state=3): >>><<< 30582 1726855284.64774: stdout chunk (state=3): >>><<< 30582 1726855284.64802: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855284.64817: _low_level_execute_command(): starting 30582 1726855284.64824: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906 `" && echo ansible-tmp-1726855284.6480317-31501-172060951591906="` echo /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906 `" ) && sleep 0' 30582 1726855284.66198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855284.66203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.66270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855284.66304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.66363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855284.66441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.68531: stdout chunk (state=3): >>>ansible-tmp-1726855284.6480317-31501-172060951591906=/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906 <<< 30582 1726855284.68636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.68650: stderr chunk (state=3): >>><<< 30582 1726855284.68653: stdout chunk (state=3): >>><<< 30582 1726855284.68744: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855284.6480317-31501-172060951591906=/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855284.68769: variable 'ansible_module_compression' from source: unknown 30582 1726855284.68961: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855284.69103: variable 'ansible_facts' from source: unknown 30582 1726855284.69938: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py 30582 1726855284.70216: Sending initial data 30582 1726855284.70219: Sent initial data (162 bytes) 30582 1726855284.71410: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855284.71552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.71568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855284.71578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855284.71591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855284.71600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855284.71619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855284.71633: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855284.71641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855284.71648: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855284.71657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.71848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.71854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855284.71965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.73592: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855284.73603: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855284.73609: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30582 1726855284.73616: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30582 1726855284.73635: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30582 1726855284.73638: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855284.73729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855284.73826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl1tziojy /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py <<< 30582 1726855284.73829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py" <<< 30582 1726855284.73869: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30582 1726855284.73884: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl1tziojy" to remote "/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py" <<< 30582 1726855284.75512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.75535: stdout chunk (state=3): >>><<< 30582 1726855284.75540: stderr chunk (state=3): >>><<< 30582 1726855284.75579: done transferring module to remote 30582 1726855284.75691: _low_level_execute_command(): starting 30582 1726855284.75695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/ /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py && sleep 0' 30582 1726855284.76210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.76230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855284.76283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855284.78180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855284.78184: stdout chunk (state=3): >>><<< 30582 1726855284.78188: stderr chunk (state=3): >>><<< 30582 1726855284.78310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855284.78315: _low_level_execute_command(): starting 30582 1726855284.78318: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/AnsiballZ_package_facts.py && sleep 0' 30582 1726855284.78813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855284.78823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855284.78832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855284.78846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855284.78860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855284.78864: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855284.78964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.78967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855284.78969: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855284.78971: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855284.78973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855284.78975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855284.78976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855284.78978: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855284.78980: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855284.79017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855284.79059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855284.79062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855284.79096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855284.79218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855285.23204: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30582 1726855285.23220: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": 
[{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": 
"1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30582 1726855285.23229: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", 
"release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", 
"version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30582 1726855285.23277: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": 
"libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30582 1726855285.23286: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30582 1726855285.23306: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": 
[{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30582 1726855285.23330: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", 
"version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855285.23340: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": 
[{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": 
"1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30582 1726855285.23366: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": 
"perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855285.23384: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", 
"version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855285.23409: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": 
"3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855285.25117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855285.25171: stderr chunk (state=3): >>><<< 30582 1726855285.25174: stdout chunk (state=3): >>><<< 30582 1726855285.25404: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855285.27093: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855285.27109: _low_level_execute_command(): starting 30582 1726855285.27114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855284.6480317-31501-172060951591906/ > /dev/null 2>&1 && sleep 0' 30582 1726855285.27554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855285.27558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855285.27560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855285.27562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found <<< 30582 1726855285.27564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855285.27619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855285.27626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855285.27688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855285.29586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855285.29591: stdout chunk (state=3): >>><<< 30582 1726855285.29594: stderr chunk (state=3): >>><<< 30582 1726855285.29693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30582 1726855285.29696: handler run complete 30582 1726855285.30570: variable 'ansible_facts' from source: unknown 30582 1726855285.30854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.32554: variable 'ansible_facts' from source: unknown 30582 1726855285.32925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.33506: attempt loop complete, returning result 30582 1726855285.33516: _execute() done 30582 1726855285.33518: dumping result to json 30582 1726855285.33637: done dumping result, returning 30582 1726855285.33655: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000000792] 30582 1726855285.33658: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000792 30582 1726855285.40139: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000792 30582 1726855285.40143: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855285.40284: no more pending results, returning what we have 30582 1726855285.40289: results queue empty 30582 1726855285.40290: checking for any_errors_fatal 30582 1726855285.40297: done checking for any_errors_fatal 30582 1726855285.40297: checking for max_fail_percentage 30582 1726855285.40299: done checking for max_fail_percentage 30582 1726855285.40300: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.40301: done checking to see if all hosts have failed 30582 1726855285.40302: getting the remaining hosts for this loop 30582 1726855285.40303: done getting the remaining hosts for this loop 30582 1726855285.40306: getting the next task for host managed_node3 30582 1726855285.40314: done 
getting next task for host managed_node3 30582 1726855285.40318: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855285.40322: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855285.40333: getting variables 30582 1726855285.40334: in VariableManager get_vars() 30582 1726855285.40362: Calling all_inventory to load vars for managed_node3 30582 1726855285.40365: Calling groups_inventory to load vars for managed_node3 30582 1726855285.40367: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.40376: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.40379: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.40382: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.41554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.43106: done with get_vars() 30582 1726855285.43135: done getting variables 30582 1726855285.43204: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:01:25 -0400 (0:00:00.875) 0:00:21.782 ****** 30582 1726855285.43242: entering _queue_task() for managed_node3/debug 30582 1726855285.43600: worker is 1 (out of 1 available) 30582 1726855285.43614: exiting _queue_task() for managed_node3/debug 30582 1726855285.43626: done queuing things up, now waiting for results queue to drain 30582 1726855285.43628: waiting for pending results... 
30582 1726855285.44012: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855285.44069: in run() - task 0affcc66-ac2b-aa83-7d57-000000000730 30582 1726855285.44092: variable 'ansible_search_path' from source: unknown 30582 1726855285.44104: variable 'ansible_search_path' from source: unknown 30582 1726855285.44146: calling self._execute() 30582 1726855285.44246: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.44259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.44275: variable 'omit' from source: magic vars 30582 1726855285.44762: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.44765: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.44768: variable 'omit' from source: magic vars 30582 1726855285.44770: variable 'omit' from source: magic vars 30582 1726855285.44856: variable 'network_provider' from source: set_fact 30582 1726855285.44882: variable 'omit' from source: magic vars 30582 1726855285.44929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855285.44968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855285.44998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855285.45020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855285.45036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855285.45071: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855285.45082: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855285.45095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.45195: Set connection var ansible_timeout to 10 30582 1726855285.45203: Set connection var ansible_connection to ssh 30582 1726855285.45214: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855285.45223: Set connection var ansible_pipelining to False 30582 1726855285.45231: Set connection var ansible_shell_executable to /bin/sh 30582 1726855285.45237: Set connection var ansible_shell_type to sh 30582 1726855285.45261: variable 'ansible_shell_executable' from source: unknown 30582 1726855285.45300: variable 'ansible_connection' from source: unknown 30582 1726855285.45303: variable 'ansible_module_compression' from source: unknown 30582 1726855285.45305: variable 'ansible_shell_type' from source: unknown 30582 1726855285.45307: variable 'ansible_shell_executable' from source: unknown 30582 1726855285.45309: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.45311: variable 'ansible_pipelining' from source: unknown 30582 1726855285.45313: variable 'ansible_timeout' from source: unknown 30582 1726855285.45315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.45435: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855285.45450: variable 'omit' from source: magic vars 30582 1726855285.45458: starting attempt loop 30582 1726855285.45517: running the handler 30582 1726855285.45520: handler run complete 30582 1726855285.45533: attempt loop complete, returning result 30582 1726855285.45539: _execute() done 30582 1726855285.45544: dumping result to json 30582 1726855285.45550: done dumping result, returning 
30582 1726855285.45561: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000000730] 30582 1726855285.45568: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000730 ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855285.45796: no more pending results, returning what we have 30582 1726855285.45801: results queue empty 30582 1726855285.45802: checking for any_errors_fatal 30582 1726855285.45814: done checking for any_errors_fatal 30582 1726855285.45815: checking for max_fail_percentage 30582 1726855285.45817: done checking for max_fail_percentage 30582 1726855285.45818: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.45819: done checking to see if all hosts have failed 30582 1726855285.45819: getting the remaining hosts for this loop 30582 1726855285.45821: done getting the remaining hosts for this loop 30582 1726855285.45824: getting the next task for host managed_node3 30582 1726855285.45834: done getting next task for host managed_node3 30582 1726855285.45838: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855285.45843: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.45855: getting variables 30582 1726855285.45857: in VariableManager get_vars() 30582 1726855285.46094: Calling all_inventory to load vars for managed_node3 30582 1726855285.46097: Calling groups_inventory to load vars for managed_node3 30582 1726855285.46099: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.46110: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.46112: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.46115: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.46802: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000730 30582 1726855285.46806: WORKER PROCESS EXITING 30582 1726855285.51913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.53438: done with get_vars() 30582 1726855285.53470: done getting variables 30582 1726855285.53522: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:01:25 -0400 (0:00:00.103) 0:00:21.885 ****** 30582 1726855285.53559: entering _queue_task() for managed_node3/fail 30582 1726855285.53910: worker is 1 (out of 1 available) 30582 1726855285.53923: exiting _queue_task() for managed_node3/fail 30582 1726855285.53934: done queuing things up, now waiting for results queue to drain 30582 1726855285.53936: waiting for pending results... 30582 1726855285.54244: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855285.54404: in run() - task 0affcc66-ac2b-aa83-7d57-000000000731 30582 1726855285.54429: variable 'ansible_search_path' from source: unknown 30582 1726855285.54437: variable 'ansible_search_path' from source: unknown 30582 1726855285.54475: calling self._execute() 30582 1726855285.54576: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.54594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.54610: variable 'omit' from source: magic vars 30582 1726855285.54971: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.54982: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.55080: variable 'network_state' from source: role '' defaults 30582 1726855285.55086: Evaluated conditional (network_state != {}): False 30582 1726855285.55091: when evaluation is False, skipping this task 30582 1726855285.55094: _execute() done 30582 1726855285.55097: dumping result to json 30582 1726855285.55101: done dumping result, returning 30582 1726855285.55108: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000000731] 30582 1726855285.55111: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000731 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855285.55250: no more pending results, returning what we have 30582 1726855285.55254: results queue empty 30582 1726855285.55255: checking for any_errors_fatal 30582 1726855285.55264: done checking for any_errors_fatal 30582 1726855285.55264: checking for max_fail_percentage 30582 1726855285.55266: done checking for max_fail_percentage 30582 1726855285.55267: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.55267: done checking to see if all hosts have failed 30582 1726855285.55268: getting the remaining hosts for this loop 30582 1726855285.55269: done getting the remaining hosts for this loop 30582 1726855285.55284: getting the next task for host managed_node3 30582 1726855285.55295: done getting next task for host managed_node3 30582 1726855285.55299: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855285.55304: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.55316: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000731 30582 1726855285.55319: WORKER PROCESS EXITING 30582 1726855285.55330: getting variables 30582 1726855285.55331: in VariableManager get_vars() 30582 1726855285.55363: Calling all_inventory to load vars for managed_node3 30582 1726855285.55366: Calling groups_inventory to load vars for managed_node3 30582 1726855285.55367: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.55380: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.55382: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.55395: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.56173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.57252: done with get_vars() 30582 1726855285.57277: done getting variables 30582 1726855285.57345: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:01:25 -0400 (0:00:00.038) 0:00:21.923 ****** 30582 1726855285.57384: entering _queue_task() for managed_node3/fail 30582 1726855285.57674: worker is 1 (out of 1 available) 30582 1726855285.57693: exiting _queue_task() for managed_node3/fail 30582 1726855285.57705: done queuing things up, now waiting for results queue to drain 30582 1726855285.57707: waiting for pending results... 30582 1726855285.57926: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855285.58092: in run() - task 0affcc66-ac2b-aa83-7d57-000000000732 30582 1726855285.58097: variable 'ansible_search_path' from source: unknown 30582 1726855285.58101: variable 'ansible_search_path' from source: unknown 30582 1726855285.58293: calling self._execute() 30582 1726855285.58297: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.58301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.58303: variable 'omit' from source: magic vars 30582 1726855285.58628: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.58646: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.58777: variable 'network_state' from source: role '' defaults 30582 1726855285.58796: Evaluated conditional (network_state != {}): False 30582 1726855285.58806: when evaluation is False, skipping this task 30582 1726855285.58815: _execute() done 30582 1726855285.58823: dumping result to json 30582 1726855285.58833: done dumping result, returning 30582 1726855285.58846: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000000732] 30582 1726855285.58861: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000732 30582 1726855285.58963: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000732 30582 1726855285.58966: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855285.59022: no more pending results, returning what we have 30582 1726855285.59026: results queue empty 30582 1726855285.59026: checking for any_errors_fatal 30582 1726855285.59032: done checking for any_errors_fatal 30582 1726855285.59033: checking for max_fail_percentage 30582 1726855285.59036: done checking for max_fail_percentage 30582 1726855285.59036: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.59037: done checking to see if all hosts have failed 30582 1726855285.59037: getting the remaining hosts for this loop 30582 1726855285.59039: done getting the remaining hosts for this loop 30582 1726855285.59042: getting the next task for host managed_node3 30582 1726855285.59051: done getting next task for host managed_node3 30582 1726855285.59055: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855285.59061: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.59083: getting variables 30582 1726855285.59084: in VariableManager get_vars() 30582 1726855285.59119: Calling all_inventory to load vars for managed_node3 30582 1726855285.59121: Calling groups_inventory to load vars for managed_node3 30582 1726855285.59123: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.59145: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.59148: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.59150: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.60028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.60915: done with get_vars() 30582 1726855285.60931: done getting variables 30582 1726855285.60974: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 14:01:25 -0400 (0:00:00.036) 0:00:21.959 ****** 30582 1726855285.61003: entering _queue_task() for managed_node3/fail 30582 1726855285.61345: worker is 1 (out of 1 available) 30582 1726855285.61360: exiting _queue_task() for managed_node3/fail 30582 1726855285.61372: done queuing things up, now waiting for results queue to drain 30582 1726855285.61374: waiting for pending results... 30582 1726855285.61608: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855285.61764: in run() - task 0affcc66-ac2b-aa83-7d57-000000000733 30582 1726855285.61792: variable 'ansible_search_path' from source: unknown 30582 1726855285.61813: variable 'ansible_search_path' from source: unknown 30582 1726855285.61859: calling self._execute() 30582 1726855285.61997: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.62035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.62149: variable 'omit' from source: magic vars 30582 1726855285.62555: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.62612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.62782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855285.64759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855285.64851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855285.64902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855285.65093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 
1726855285.65096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855285.65099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.65103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.65110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.65154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.65173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.65271: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.65303: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855285.65423: variable 'ansible_distribution' from source: facts 30582 1726855285.65431: variable '__network_rh_distros' from source: role '' defaults 30582 1726855285.65443: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855285.66057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.66062: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.66065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.66068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.66071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.66074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.66117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.66147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.66193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.66212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855285.66257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.66291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.66319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.66361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.66383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.66759: variable 'network_connections' from source: include params 30582 1726855285.66769: variable 'interface' from source: play vars 30582 1726855285.66826: variable 'interface' from source: play vars 30582 1726855285.66838: variable 'network_state' from source: role '' defaults 30582 1726855285.66885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855285.67019: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855285.67058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855285.67083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855285.67106: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855285.67138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855285.67155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855285.67177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.67199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855285.67227: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855285.67230: when evaluation is False, skipping this task 30582 1726855285.67233: _execute() done 30582 1726855285.67236: dumping result to json 30582 1726855285.67238: done dumping result, returning 30582 1726855285.67245: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000000733] 30582 1726855285.67255: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000733 30582 1726855285.67338: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000733 30582 1726855285.67341: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855285.67388: no more pending results, returning what we have 30582 1726855285.67392: results queue empty 30582 1726855285.67393: checking for any_errors_fatal 30582 1726855285.67401: done checking for any_errors_fatal 30582 1726855285.67401: checking for max_fail_percentage 30582 1726855285.67404: done checking for max_fail_percentage 30582 1726855285.67405: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.67405: done checking to see if all hosts have failed 30582 1726855285.67406: getting the remaining hosts for this loop 30582 1726855285.67407: done getting the remaining hosts for this loop 30582 1726855285.67411: getting the next task for host managed_node3 30582 1726855285.67419: done getting next task for host managed_node3 30582 1726855285.67423: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855285.67428: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.67443: getting variables 30582 1726855285.67445: in VariableManager get_vars() 30582 1726855285.67481: Calling all_inventory to load vars for managed_node3 30582 1726855285.67484: Calling groups_inventory to load vars for managed_node3 30582 1726855285.67486: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.67498: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.67500: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.67503: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.68324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.69452: done with get_vars() 30582 1726855285.69477: done getting variables 30582 1726855285.69538: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:01:25 -0400 (0:00:00.085) 0:00:22.045 ****** 30582 1726855285.69569: entering _queue_task() for managed_node3/dnf 30582 1726855285.69900: worker is 1 (out of 1 available) 30582 1726855285.69913: exiting _queue_task() for managed_node3/dnf 30582 1726855285.69924: done queuing things up, now waiting for results queue to drain 30582 1726855285.69926: waiting for pending results... 30582 1726855285.70306: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855285.70493: in run() - task 0affcc66-ac2b-aa83-7d57-000000000734 30582 1726855285.70497: variable 'ansible_search_path' from source: unknown 30582 1726855285.70499: variable 'ansible_search_path' from source: unknown 30582 1726855285.70503: calling self._execute() 30582 1726855285.70505: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.70508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.70511: variable 'omit' from source: magic vars 30582 1726855285.71011: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.71015: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.71120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855285.73516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855285.73600: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855285.73656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855285.73700: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855285.73745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855285.73838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.73872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.73901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.73947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.73967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.74098: variable 'ansible_distribution' from source: facts 30582 1726855285.74108: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.74125: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855285.74248: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855285.74395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855285.74423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.74451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.74502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.74594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.74597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.74600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.74618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.74661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.74682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.74736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.74763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.74792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.74919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.74922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.75036: variable 'network_connections' from source: include params 30582 1726855285.75057: variable 'interface' from source: play vars 30582 1726855285.75124: variable 'interface' from source: play vars 30582 1726855285.75208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855285.75399: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855285.75441: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855285.75572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855285.75576: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855285.75578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855285.75599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855285.75637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.75667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855285.75733: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855285.75983: variable 'network_connections' from source: include params 30582 1726855285.76002: variable 'interface' from source: play vars 30582 1726855285.76111: variable 'interface' from source: play vars 30582 1726855285.76114: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855285.76117: when evaluation is False, skipping this task 30582 1726855285.76123: _execute() done 30582 1726855285.76127: dumping result to json 30582 1726855285.76132: done dumping result, returning 30582 1726855285.76144: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000734] 30582 1726855285.76154: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000734 skipping: [managed_node3] 
=> { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855285.76497: no more pending results, returning what we have 30582 1726855285.76502: results queue empty 30582 1726855285.76503: checking for any_errors_fatal 30582 1726855285.76510: done checking for any_errors_fatal 30582 1726855285.76511: checking for max_fail_percentage 30582 1726855285.76513: done checking for max_fail_percentage 30582 1726855285.76514: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.76515: done checking to see if all hosts have failed 30582 1726855285.76516: getting the remaining hosts for this loop 30582 1726855285.76517: done getting the remaining hosts for this loop 30582 1726855285.76522: getting the next task for host managed_node3 30582 1726855285.76533: done getting next task for host managed_node3 30582 1726855285.76537: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855285.76544: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.76564: getting variables 30582 1726855285.76566: in VariableManager get_vars() 30582 1726855285.76607: Calling all_inventory to load vars for managed_node3 30582 1726855285.76610: Calling groups_inventory to load vars for managed_node3 30582 1726855285.76613: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.76624: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.76627: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.76630: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.77229: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000734 30582 1726855285.77241: WORKER PROCESS EXITING 30582 1726855285.78261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.79885: done with get_vars() 30582 1726855285.79908: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855285.79997: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:01:25 -0400 (0:00:00.104) 0:00:22.150 ****** 30582 1726855285.80030: entering _queue_task() for managed_node3/yum 30582 1726855285.80524: worker is 1 (out of 1 available) 30582 1726855285.80535: exiting _queue_task() for managed_node3/yum 30582 1726855285.80546: done queuing things up, now waiting for results queue to drain 30582 1726855285.80547: waiting for pending results... 30582 1726855285.80762: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855285.80913: in run() - task 0affcc66-ac2b-aa83-7d57-000000000735 30582 1726855285.80942: variable 'ansible_search_path' from source: unknown 30582 1726855285.80952: variable 'ansible_search_path' from source: unknown 30582 1726855285.81000: calling self._execute() 30582 1726855285.81105: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.81117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.81131: variable 'omit' from source: magic vars 30582 1726855285.81757: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.81763: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.81984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855285.83704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855285.84202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855285.84205: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855285.84207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855285.84210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855285.84262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.84301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.84344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.84394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.84427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.84533: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.84635: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855285.84643: when evaluation is False, skipping this task 30582 1726855285.84645: _execute() done 30582 1726855285.84647: dumping result to json 30582 1726855285.84649: done dumping result, returning 30582 1726855285.84652: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or 
team interfaces [0affcc66-ac2b-aa83-7d57-000000000735] 30582 1726855285.84654: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000735 30582 1726855285.84740: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000735 30582 1726855285.84743: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855285.85067: no more pending results, returning what we have 30582 1726855285.85078: results queue empty 30582 1726855285.85080: checking for any_errors_fatal 30582 1726855285.85089: done checking for any_errors_fatal 30582 1726855285.85090: checking for max_fail_percentage 30582 1726855285.85093: done checking for max_fail_percentage 30582 1726855285.85094: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.85094: done checking to see if all hosts have failed 30582 1726855285.85095: getting the remaining hosts for this loop 30582 1726855285.85097: done getting the remaining hosts for this loop 30582 1726855285.85101: getting the next task for host managed_node3 30582 1726855285.85114: done getting next task for host managed_node3 30582 1726855285.85119: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855285.85124: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.85142: getting variables 30582 1726855285.85143: in VariableManager get_vars() 30582 1726855285.85176: Calling all_inventory to load vars for managed_node3 30582 1726855285.85178: Calling groups_inventory to load vars for managed_node3 30582 1726855285.85299: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.85309: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.85312: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.85315: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.87179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.89172: done with get_vars() 30582 1726855285.89197: done getting variables 30582 1726855285.89244: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 
14:01:25 -0400 (0:00:00.092) 0:00:22.242 ****** 30582 1726855285.89269: entering _queue_task() for managed_node3/fail 30582 1726855285.89538: worker is 1 (out of 1 available) 30582 1726855285.89554: exiting _queue_task() for managed_node3/fail 30582 1726855285.89566: done queuing things up, now waiting for results queue to drain 30582 1726855285.89568: waiting for pending results... 30582 1726855285.89756: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855285.89855: in run() - task 0affcc66-ac2b-aa83-7d57-000000000736 30582 1726855285.89865: variable 'ansible_search_path' from source: unknown 30582 1726855285.89869: variable 'ansible_search_path' from source: unknown 30582 1726855285.89907: calling self._execute() 30582 1726855285.89969: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855285.89973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855285.89980: variable 'omit' from source: magic vars 30582 1726855285.90353: variable 'ansible_distribution_major_version' from source: facts 30582 1726855285.90407: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855285.90524: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855285.90728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855285.92932: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855285.92993: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855285.93076: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855285.93080: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855285.93115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855285.93173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.93202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.93233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.93263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.93284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.93329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.93351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.93391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 30582 1726855285.93427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.93450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.93485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855285.93559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855285.93562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.93569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855285.93591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855285.94113: variable 'network_connections' from source: include params 30582 1726855285.94117: variable 'interface' from source: play vars 30582 1726855285.94241: variable 'interface' from source: play vars 30582 1726855285.94414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855285.94515: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855285.94565: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855285.94601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855285.94634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855285.94672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855285.94700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855285.94726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855285.94750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855285.94850: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855285.95069: variable 'network_connections' from source: include params 30582 1726855285.95072: variable 'interface' from source: play vars 30582 1726855285.95132: variable 'interface' from source: play vars 30582 1726855285.95167: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855285.95179: when evaluation is False, skipping this task 30582 1726855285.95181: _execute() done 30582 1726855285.95183: dumping result to json 30582 1726855285.95186: done dumping result, returning 30582 1726855285.95190: 
done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000736] 30582 1726855285.95196: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000736 30582 1726855285.95502: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000736 30582 1726855285.95505: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855285.95544: no more pending results, returning what we have 30582 1726855285.95547: results queue empty 30582 1726855285.95548: checking for any_errors_fatal 30582 1726855285.95554: done checking for any_errors_fatal 30582 1726855285.95555: checking for max_fail_percentage 30582 1726855285.95556: done checking for max_fail_percentage 30582 1726855285.95557: checking to see if all hosts have failed and the running result is not ok 30582 1726855285.95557: done checking to see if all hosts have failed 30582 1726855285.95558: getting the remaining hosts for this loop 30582 1726855285.95559: done getting the remaining hosts for this loop 30582 1726855285.95562: getting the next task for host managed_node3 30582 1726855285.95568: done getting next task for host managed_node3 30582 1726855285.95572: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855285.95576: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855285.95593: getting variables 30582 1726855285.95594: in VariableManager get_vars() 30582 1726855285.95631: Calling all_inventory to load vars for managed_node3 30582 1726855285.95634: Calling groups_inventory to load vars for managed_node3 30582 1726855285.95637: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855285.95646: Calling all_plugins_play to load vars for managed_node3 30582 1726855285.95649: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855285.95652: Calling groups_plugins_play to load vars for managed_node3 30582 1726855285.97273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855285.99201: done with get_vars() 30582 1726855285.99233: done getting variables 30582 1726855285.99315: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** 
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:01:25 -0400 (0:00:00.100) 0:00:22.343 ****** 30582 1726855285.99353: entering _queue_task() for managed_node3/package 30582 1726855286.00022: worker is 1 (out of 1 available) 30582 1726855286.00032: exiting _queue_task() for managed_node3/package 30582 1726855286.00042: done queuing things up, now waiting for results queue to drain 30582 1726855286.00043: waiting for pending results... 30582 1726855286.00294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855286.00426: in run() - task 0affcc66-ac2b-aa83-7d57-000000000737 30582 1726855286.00453: variable 'ansible_search_path' from source: unknown 30582 1726855286.00463: variable 'ansible_search_path' from source: unknown 30582 1726855286.00546: calling self._execute() 30582 1726855286.00630: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.00645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.00671: variable 'omit' from source: magic vars 30582 1726855286.01152: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.01202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855286.01517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855286.01859: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855286.01936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855286.02235: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855286.02654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
30582 1726855286.02906: variable 'network_packages' from source: role '' defaults 30582 1726855286.03235: variable '__network_provider_setup' from source: role '' defaults 30582 1726855286.03238: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855286.03452: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855286.03455: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855286.03527: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855286.04199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855286.09886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855286.10081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855286.10157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855286.10393: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855286.10397: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855286.10533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.10640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.10678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 30582 1726855286.10843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.10950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.11058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.11081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.11114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.11290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.11306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.11855: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855286.12219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.12271: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.12571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.12578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.12581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.12791: variable 'ansible_python' from source: facts 30582 1726855286.12809: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855286.13022: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855286.13214: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855286.13520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.13596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.13631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.13800: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.13820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.13876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.14093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.14098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.14300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.14304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.14562: variable 'network_connections' from source: include params 30582 1726855286.14629: variable 'interface' from source: play vars 30582 1726855286.14814: variable 'interface' from source: play vars 30582 1726855286.15094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855286.15120: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855286.15198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.15392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855286.15427: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855286.16118: variable 'network_connections' from source: include params 30582 1726855286.16190: variable 'interface' from source: play vars 30582 1726855286.16513: variable 'interface' from source: play vars 30582 1726855286.16517: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855286.16672: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855286.17633: variable 'network_connections' from source: include params 30582 1726855286.17636: variable 'interface' from source: play vars 30582 1726855286.17694: variable 'interface' from source: play vars 30582 1726855286.17770: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855286.18092: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855286.18808: variable 'network_connections' from source: include params 30582 1726855286.18811: variable 'interface' from source: play vars 30582 1726855286.18992: variable 'interface' from source: play vars 30582 1726855286.19004: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855286.19164: variable '__network_service_name_default_initscripts' from source: role '' defaults 
30582 1726855286.19202: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855286.19385: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855286.19856: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855286.20972: variable 'network_connections' from source: include params 30582 1726855286.20995: variable 'interface' from source: play vars 30582 1726855286.21170: variable 'interface' from source: play vars 30582 1726855286.21190: variable 'ansible_distribution' from source: facts 30582 1726855286.21205: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.21219: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.21250: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855286.21746: variable 'ansible_distribution' from source: facts 30582 1726855286.21749: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.21762: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.21776: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855286.22063: variable 'ansible_distribution' from source: facts 30582 1726855286.22392: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.22397: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.22400: variable 'network_provider' from source: set_fact 30582 1726855286.22402: variable 'ansible_facts' from source: unknown 30582 1726855286.23869: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855286.23873: when evaluation is False, skipping this task 30582 1726855286.23875: _execute() done 30582 1726855286.23877: dumping result to json 30582 1726855286.23879: done dumping result, returning 
30582 1726855286.23881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000000737] 30582 1726855286.23883: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000737 30582 1726855286.24203: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000737 30582 1726855286.24206: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855286.24270: no more pending results, returning what we have 30582 1726855286.24275: results queue empty 30582 1726855286.24276: checking for any_errors_fatal 30582 1726855286.24285: done checking for any_errors_fatal 30582 1726855286.24286: checking for max_fail_percentage 30582 1726855286.24292: done checking for max_fail_percentage 30582 1726855286.24293: checking to see if all hosts have failed and the running result is not ok 30582 1726855286.24293: done checking to see if all hosts have failed 30582 1726855286.24294: getting the remaining hosts for this loop 30582 1726855286.24295: done getting the remaining hosts for this loop 30582 1726855286.24300: getting the next task for host managed_node3 30582 1726855286.24310: done getting next task for host managed_node3 30582 1726855286.24322: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855286.24328: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855286.24346: getting variables 30582 1726855286.24350: in VariableManager get_vars() 30582 1726855286.24553: Calling all_inventory to load vars for managed_node3 30582 1726855286.24557: Calling groups_inventory to load vars for managed_node3 30582 1726855286.24559: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855286.24572: Calling all_plugins_play to load vars for managed_node3 30582 1726855286.24575: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855286.24579: Calling groups_plugins_play to load vars for managed_node3 30582 1726855286.26876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855286.29818: done with get_vars() 30582 1726855286.29878: done getting variables 30582 1726855286.30198: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when 
using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:01:26 -0400 (0:00:00.308) 0:00:22.652 ****** 30582 1726855286.30234: entering _queue_task() for managed_node3/package 30582 1726855286.31082: worker is 1 (out of 1 available) 30582 1726855286.31100: exiting _queue_task() for managed_node3/package 30582 1726855286.31112: done queuing things up, now waiting for results queue to drain 30582 1726855286.31114: waiting for pending results... 30582 1726855286.31634: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855286.31805: in run() - task 0affcc66-ac2b-aa83-7d57-000000000738 30582 1726855286.31811: variable 'ansible_search_path' from source: unknown 30582 1726855286.31814: variable 'ansible_search_path' from source: unknown 30582 1726855286.31920: calling self._execute() 30582 1726855286.32021: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.32026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.32033: variable 'omit' from source: magic vars 30582 1726855286.32484: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.32497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855286.32749: variable 'network_state' from source: role '' defaults 30582 1726855286.32761: Evaluated conditional (network_state != {}): False 30582 1726855286.32764: when evaluation is False, skipping this task 30582 1726855286.32767: _execute() done 30582 1726855286.32769: dumping result to json 30582 1726855286.32772: done dumping result, returning 30582 1726855286.32790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state 
variable [0affcc66-ac2b-aa83-7d57-000000000738] 30582 1726855286.32793: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000738 30582 1726855286.33042: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000738 30582 1726855286.33046: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855286.33101: no more pending results, returning what we have 30582 1726855286.33105: results queue empty 30582 1726855286.33106: checking for any_errors_fatal 30582 1726855286.33113: done checking for any_errors_fatal 30582 1726855286.33113: checking for max_fail_percentage 30582 1726855286.33116: done checking for max_fail_percentage 30582 1726855286.33116: checking to see if all hosts have failed and the running result is not ok 30582 1726855286.33117: done checking to see if all hosts have failed 30582 1726855286.33118: getting the remaining hosts for this loop 30582 1726855286.33119: done getting the remaining hosts for this loop 30582 1726855286.33124: getting the next task for host managed_node3 30582 1726855286.33133: done getting next task for host managed_node3 30582 1726855286.33137: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855286.33142: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855286.33162: getting variables 30582 1726855286.33164: in VariableManager get_vars() 30582 1726855286.33212: Calling all_inventory to load vars for managed_node3 30582 1726855286.33215: Calling groups_inventory to load vars for managed_node3 30582 1726855286.33217: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855286.33233: Calling all_plugins_play to load vars for managed_node3 30582 1726855286.33237: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855286.33245: Calling groups_plugins_play to load vars for managed_node3 30582 1726855286.36361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855286.39061: done with get_vars() 30582 1726855286.39086: done getting variables 30582 1726855286.39146: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:01:26 -0400 
(0:00:00.089) 0:00:22.741 ****** 30582 1726855286.39185: entering _queue_task() for managed_node3/package 30582 1726855286.39702: worker is 1 (out of 1 available) 30582 1726855286.39720: exiting _queue_task() for managed_node3/package 30582 1726855286.39733: done queuing things up, now waiting for results queue to drain 30582 1726855286.39735: waiting for pending results... 30582 1726855286.40218: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855286.40745: in run() - task 0affcc66-ac2b-aa83-7d57-000000000739 30582 1726855286.40749: variable 'ansible_search_path' from source: unknown 30582 1726855286.40751: variable 'ansible_search_path' from source: unknown 30582 1726855286.40755: calling self._execute() 30582 1726855286.40881: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.40951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.40967: variable 'omit' from source: magic vars 30582 1726855286.42394: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.42397: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855286.42643: variable 'network_state' from source: role '' defaults 30582 1726855286.42657: Evaluated conditional (network_state != {}): False 30582 1726855286.42666: when evaluation is False, skipping this task 30582 1726855286.42673: _execute() done 30582 1726855286.42682: dumping result to json 30582 1726855286.42691: done dumping result, returning 30582 1726855286.42710: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000739] 30582 1726855286.42725: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000739 30582 1726855286.43196: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000739 30582 1726855286.43200: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855286.43237: no more pending results, returning what we have 30582 1726855286.43241: results queue empty 30582 1726855286.43242: checking for any_errors_fatal 30582 1726855286.43248: done checking for any_errors_fatal 30582 1726855286.43249: checking for max_fail_percentage 30582 1726855286.43251: done checking for max_fail_percentage 30582 1726855286.43252: checking to see if all hosts have failed and the running result is not ok 30582 1726855286.43253: done checking to see if all hosts have failed 30582 1726855286.43254: getting the remaining hosts for this loop 30582 1726855286.43255: done getting the remaining hosts for this loop 30582 1726855286.43258: getting the next task for host managed_node3 30582 1726855286.43265: done getting next task for host managed_node3 30582 1726855286.43269: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855286.43277: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855286.43297: getting variables 30582 1726855286.43298: in VariableManager get_vars() 30582 1726855286.43329: Calling all_inventory to load vars for managed_node3 30582 1726855286.43332: Calling groups_inventory to load vars for managed_node3 30582 1726855286.43334: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855286.43344: Calling all_plugins_play to load vars for managed_node3 30582 1726855286.43347: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855286.43351: Calling groups_plugins_play to load vars for managed_node3 30582 1726855286.45614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855286.47400: done with get_vars() 30582 1726855286.47428: done getting variables 30582 1726855286.47498: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:01:26 -0400 (0:00:00.083) 0:00:22.825 ****** 30582 1726855286.47544: entering _queue_task() for managed_node3/service 30582 1726855286.47984: worker is 1 (out of 1 available) 30582 1726855286.48153: exiting _queue_task() for managed_node3/service 30582 1726855286.48164: done 
queuing things up, now waiting for results queue to drain 30582 1726855286.48166: waiting for pending results... 30582 1726855286.48347: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855286.48609: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073a 30582 1726855286.48614: variable 'ansible_search_path' from source: unknown 30582 1726855286.48617: variable 'ansible_search_path' from source: unknown 30582 1726855286.48696: calling self._execute() 30582 1726855286.48758: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.48770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.48828: variable 'omit' from source: magic vars 30582 1726855286.49458: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.49462: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855286.49696: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855286.50117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855286.53106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855286.53202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855286.53259: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855286.53313: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855286.53343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855286.53548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.53552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.53573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.53666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.53692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.53764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.53797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.53826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.53884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.53907: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.53961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.53995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.54024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.54079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.54102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.54298: variable 'network_connections' from source: include params 30582 1726855286.54398: variable 'interface' from source: play vars 30582 1726855286.54401: variable 'interface' from source: play vars 30582 1726855286.54476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855286.54660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855286.55108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855286.55142: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855286.55185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855286.55239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855286.55280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855286.55321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.55349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855286.55426: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855286.55706: variable 'network_connections' from source: include params 30582 1726855286.55716: variable 'interface' from source: play vars 30582 1726855286.55815: variable 'interface' from source: play vars 30582 1726855286.55828: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855286.55836: when evaluation is False, skipping this task 30582 1726855286.55843: _execute() done 30582 1726855286.55849: dumping result to json 30582 1726855286.55855: done dumping result, returning 30582 1726855286.55891: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000073a] 30582 1726855286.55894: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000073a 30582 1726855286.56135: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073a 30582 1726855286.56147: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855286.56200: no more pending results, returning what we have 30582 1726855286.56205: results queue empty 30582 1726855286.56206: checking for any_errors_fatal 30582 1726855286.56214: done checking for any_errors_fatal 30582 1726855286.56215: checking for max_fail_percentage 30582 1726855286.56217: done checking for max_fail_percentage 30582 1726855286.56218: checking to see if all hosts have failed and the running result is not ok 30582 1726855286.56219: done checking to see if all hosts have failed 30582 1726855286.56219: getting the remaining hosts for this loop 30582 1726855286.56221: done getting the remaining hosts for this loop 30582 1726855286.56225: getting the next task for host managed_node3 30582 1726855286.56234: done getting next task for host managed_node3 30582 1726855286.56238: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855286.56243: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855286.56263: getting variables 30582 1726855286.56265: in VariableManager get_vars() 30582 1726855286.56305: Calling all_inventory to load vars for managed_node3 30582 1726855286.56308: Calling groups_inventory to load vars for managed_node3 30582 1726855286.56310: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855286.56322: Calling all_plugins_play to load vars for managed_node3 30582 1726855286.56325: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855286.56327: Calling groups_plugins_play to load vars for managed_node3 30582 1726855286.58379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855286.60269: done with get_vars() 30582 1726855286.60509: done getting variables 30582 1726855286.60571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:01:26 -0400 (0:00:00.130) 0:00:22.956 
****** 30582 1726855286.60621: entering _queue_task() for managed_node3/service 30582 1726855286.61493: worker is 1 (out of 1 available) 30582 1726855286.61508: exiting _queue_task() for managed_node3/service 30582 1726855286.61521: done queuing things up, now waiting for results queue to drain 30582 1726855286.61522: waiting for pending results... 30582 1726855286.61932: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855286.62006: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073b 30582 1726855286.62025: variable 'ansible_search_path' from source: unknown 30582 1726855286.62029: variable 'ansible_search_path' from source: unknown 30582 1726855286.62063: calling self._execute() 30582 1726855286.62160: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.62164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.62177: variable 'omit' from source: magic vars 30582 1726855286.62558: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.62572: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855286.62738: variable 'network_provider' from source: set_fact 30582 1726855286.62742: variable 'network_state' from source: role '' defaults 30582 1726855286.62753: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855286.62759: variable 'omit' from source: magic vars 30582 1726855286.62823: variable 'omit' from source: magic vars 30582 1726855286.62849: variable 'network_service_name' from source: role '' defaults 30582 1726855286.62916: variable 'network_service_name' from source: role '' defaults 30582 1726855286.63017: variable '__network_provider_setup' from source: role '' defaults 30582 1726855286.63022: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855286.63086: variable 
'__network_service_name_default_nm' from source: role '' defaults 30582 1726855286.63116: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855286.63159: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855286.63455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855286.66551: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855286.66695: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855286.66699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855286.66712: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855286.66735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855286.66826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.66848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.66942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.66946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.66948: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.66973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.66996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.67025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.67066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.67079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.67660: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855286.67902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.67979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.68093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.68097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.68099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.68197: variable 'ansible_python' from source: facts 30582 1726855286.68201: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855286.68494: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855286.68582: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855286.68829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.68850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.68879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.69008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.69027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.69076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855286.69224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855286.69268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.69291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855286.69379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855286.69647: variable 'network_connections' from source: include params 30582 1726855286.69655: variable 'interface' from source: play vars 30582 1726855286.69732: variable 'interface' from source: play vars 30582 1726855286.70000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855286.70064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855286.70118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855286.70158: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855286.70200: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855286.70264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855286.70294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855286.70328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855286.70361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855286.70409: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855286.70702: variable 'network_connections' from source: include params 30582 1726855286.70708: variable 'interface' from source: play vars 30582 1726855286.70784: variable 'interface' from source: play vars 30582 1726855286.70834: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855286.70918: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855286.71258: variable 'network_connections' from source: include params 30582 1726855286.71262: variable 'interface' from source: play vars 30582 1726855286.71425: variable 'interface' from source: play vars 30582 1726855286.71481: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855286.71669: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855286.72331: variable 'network_connections' from source: include params 30582 
1726855286.72335: variable 'interface' from source: play vars 30582 1726855286.72438: variable 'interface' from source: play vars 30582 1726855286.72544: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855286.72620: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855286.72628: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855286.72770: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855286.73397: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855286.73658: variable 'network_connections' from source: include params 30582 1726855286.73668: variable 'interface' from source: play vars 30582 1726855286.73729: variable 'interface' from source: play vars 30582 1726855286.73737: variable 'ansible_distribution' from source: facts 30582 1726855286.73740: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.73746: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.73781: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855286.73955: variable 'ansible_distribution' from source: facts 30582 1726855286.73959: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.73964: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.73976: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855286.74142: variable 'ansible_distribution' from source: facts 30582 1726855286.74145: variable '__network_rh_distros' from source: role '' defaults 30582 1726855286.74152: variable 'ansible_distribution_major_version' from source: facts 30582 1726855286.74185: variable 'network_provider' from source: set_fact 30582 1726855286.74215: variable 'omit' from source: 
magic vars 30582 1726855286.74241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855286.74290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855286.74308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855286.74332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855286.74343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855286.74380: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855286.74383: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.74385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.74555: Set connection var ansible_timeout to 10 30582 1726855286.74559: Set connection var ansible_connection to ssh 30582 1726855286.74561: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855286.74563: Set connection var ansible_pipelining to False 30582 1726855286.74565: Set connection var ansible_shell_executable to /bin/sh 30582 1726855286.74567: Set connection var ansible_shell_type to sh 30582 1726855286.74569: variable 'ansible_shell_executable' from source: unknown 30582 1726855286.74571: variable 'ansible_connection' from source: unknown 30582 1726855286.74586: variable 'ansible_module_compression' from source: unknown 30582 1726855286.74597: variable 'ansible_shell_type' from source: unknown 30582 1726855286.74600: variable 'ansible_shell_executable' from source: unknown 30582 1726855286.74602: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855286.74604: variable 'ansible_pipelining' from source: unknown 30582 
1726855286.74606: variable 'ansible_timeout' from source: unknown 30582 1726855286.74608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855286.74812: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855286.74824: variable 'omit' from source: magic vars 30582 1726855286.74827: starting attempt loop 30582 1726855286.74829: running the handler 30582 1726855286.74883: variable 'ansible_facts' from source: unknown 30582 1726855286.75647: _low_level_execute_command(): starting 30582 1726855286.75681: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855286.76319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855286.76392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.76431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' <<< 30582 1726855286.76451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855286.76564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855286.78214: stdout chunk (state=3): >>>/root <<< 30582 1726855286.78306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855286.78341: stderr chunk (state=3): >>><<< 30582 1726855286.78343: stdout chunk (state=3): >>><<< 30582 1726855286.78354: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855286.78366: _low_level_execute_command(): starting 30582 1726855286.78431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307 `" && echo ansible-tmp-1726855286.7835891-31598-277323238499307="` echo /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307 `" ) && sleep 0' 30582 1726855286.78861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855286.78890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855286.78912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855286.79014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855286.80932: stdout chunk (state=3): >>>ansible-tmp-1726855286.7835891-31598-277323238499307=/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307 <<< 30582 1726855286.81034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855286.81060: stderr chunk (state=3): >>><<< 30582 1726855286.81065: stdout chunk (state=3): >>><<< 30582 
1726855286.81083: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855286.7835891-31598-277323238499307=/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855286.81112: variable 'ansible_module_compression' from source: unknown 30582 1726855286.81151: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855286.81207: variable 'ansible_facts' from source: unknown 30582 1726855286.81341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py 30582 1726855286.81446: Sending initial data 30582 1726855286.81449: Sent initial data (156 bytes) 30582 1726855286.81879: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855286.81884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855286.81911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.81914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855286.81921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.81981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855286.81993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855286.81995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855286.82052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855286.83686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855286.83750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855286.83808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpvlk_o7ef /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py <<< 30582 1726855286.83811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py" <<< 30582 1726855286.83863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30582 1726855286.83868: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpvlk_o7ef" to remote "/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py" <<< 30582 1726855286.85024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855286.85065: stderr chunk (state=3): >>><<< 30582 1726855286.85068: stdout chunk (state=3): >>><<< 30582 1726855286.85112: done transferring module to remote 30582 1726855286.85120: _low_level_execute_command(): starting 30582 1726855286.85125: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/ /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py && sleep 0' 30582 1726855286.85552: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855286.85559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855286.85590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.85593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855286.85596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855286.85598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.85651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855286.85654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855286.85714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855286.87512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855286.87528: stderr chunk (state=3): >>><<< 30582 1726855286.87531: stdout chunk (state=3): >>><<< 30582 1726855286.87548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855286.87550: _low_level_execute_command(): starting 30582 1726855286.87555: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/AnsiballZ_systemd.py && sleep 0' 30582 1726855286.88062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855286.88066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.88068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855286.88070: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855286.88072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855286.88127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855286.88134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855286.88200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.17455: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10608640", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316731904", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2048955000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855287.17516: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": 
"0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855287.19344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.19419: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855287.19422: stdout chunk (state=3): >>><<< 30582 1726855287.19425: stderr chunk (state=3): >>><<< 30582 1726855287.19456: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10608640", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316731904", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2048955000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855287.19771: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855287.19777: _low_level_execute_command(): starting 30582 1726855287.19780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855286.7835891-31598-277323238499307/ > /dev/null 2>&1 && sleep 0' 30582 1726855287.20610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855287.20710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855287.20822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855287.20946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.21082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.22924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.22980: stderr chunk (state=3): >>><<< 30582 1726855287.22992: stdout chunk (state=3): >>><<< 30582 1726855287.23013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855287.23026: handler run complete 30582 1726855287.23097: 
attempt loop complete, returning result 30582 1726855287.23106: _execute() done 30582 1726855287.23113: dumping result to json 30582 1726855287.23137: done dumping result, returning 30582 1726855287.23152: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-00000000073b] 30582 1726855287.23161: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073b 30582 1726855287.23653: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073b 30582 1726855287.23657: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855287.23712: no more pending results, returning what we have 30582 1726855287.23715: results queue empty 30582 1726855287.23716: checking for any_errors_fatal 30582 1726855287.23721: done checking for any_errors_fatal 30582 1726855287.23721: checking for max_fail_percentage 30582 1726855287.23723: done checking for max_fail_percentage 30582 1726855287.23724: checking to see if all hosts have failed and the running result is not ok 30582 1726855287.23724: done checking to see if all hosts have failed 30582 1726855287.23725: getting the remaining hosts for this loop 30582 1726855287.23727: done getting the remaining hosts for this loop 30582 1726855287.23730: getting the next task for host managed_node3 30582 1726855287.23738: done getting next task for host managed_node3 30582 1726855287.23742: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855287.23747: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855287.23757: getting variables 30582 1726855287.23760: in VariableManager get_vars() 30582 1726855287.23789: Calling all_inventory to load vars for managed_node3 30582 1726855287.23791: Calling groups_inventory to load vars for managed_node3 30582 1726855287.23793: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855287.23803: Calling all_plugins_play to load vars for managed_node3 30582 1726855287.23805: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855287.23808: Calling groups_plugins_play to load vars for managed_node3 30582 1726855287.25108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855287.26695: done with get_vars() 30582 1726855287.26722: done getting variables 30582 1726855287.26790: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:01:27 -0400 (0:00:00.662) 0:00:23.618 ****** 30582 1726855287.26831: entering _queue_task() for managed_node3/service 30582 1726855287.27197: worker is 1 (out of 1 available) 30582 1726855287.27212: exiting _queue_task() for managed_node3/service 30582 1726855287.27224: done queuing things up, now waiting for results queue to drain 30582 1726855287.27226: waiting for pending results... 30582 1726855287.27607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855287.27675: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073c 30582 1726855287.27698: variable 'ansible_search_path' from source: unknown 30582 1726855287.27707: variable 'ansible_search_path' from source: unknown 30582 1726855287.27744: calling self._execute() 30582 1726855287.27840: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.27851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.27920: variable 'omit' from source: magic vars 30582 1726855287.28222: variable 'ansible_distribution_major_version' from source: facts 30582 1726855287.28242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855287.28369: variable 'network_provider' from source: set_fact 30582 1726855287.28385: Evaluated conditional (network_provider == "nm"): True 30582 1726855287.28490: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855287.28589: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30582 1726855287.28768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855287.31202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855287.31292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855287.31322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855287.31362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855287.31592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855287.31596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855287.31598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855287.31601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855287.31603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855287.31605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855287.31655: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855287.31686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855287.31719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855287.31762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855287.31784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855287.31834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855287.31863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855287.31899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855287.31946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855287.31965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855287.32117: variable 'network_connections' from source: include params 30582 1726855287.32134: variable 'interface' from source: play vars 30582 1726855287.32215: variable 'interface' from source: play vars 30582 1726855287.32309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855287.32460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855287.32512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855287.32548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855287.32594: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855287.32640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855287.32703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855287.32706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855287.32735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855287.32793: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30582 1726855287.33056: variable 'network_connections' from source: include params 30582 1726855287.33068: variable 'interface' from source: play vars 30582 1726855287.33143: variable 'interface' from source: play vars 30582 1726855287.33247: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855287.33250: when evaluation is False, skipping this task 30582 1726855287.33252: _execute() done 30582 1726855287.33254: dumping result to json 30582 1726855287.33257: done dumping result, returning 30582 1726855287.33259: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-00000000073c] 30582 1726855287.33269: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073c 30582 1726855287.33493: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073c 30582 1726855287.33496: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855287.33545: no more pending results, returning what we have 30582 1726855287.33550: results queue empty 30582 1726855287.33551: checking for any_errors_fatal 30582 1726855287.33578: done checking for any_errors_fatal 30582 1726855287.33579: checking for max_fail_percentage 30582 1726855287.33581: done checking for max_fail_percentage 30582 1726855287.33582: checking to see if all hosts have failed and the running result is not ok 30582 1726855287.33583: done checking to see if all hosts have failed 30582 1726855287.33583: getting the remaining hosts for this loop 30582 1726855287.33585: done getting the remaining hosts for this loop 30582 1726855287.33591: getting the next task for host managed_node3 30582 1726855287.33601: done getting next task for host managed_node3 30582 1726855287.33605: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30582 1726855287.33610: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855287.33628: getting variables 30582 1726855287.33630: in VariableManager get_vars() 30582 1726855287.33669: Calling all_inventory to load vars for managed_node3 30582 1726855287.33672: Calling groups_inventory to load vars for managed_node3 30582 1726855287.33678: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855287.33792: Calling all_plugins_play to load vars for managed_node3 30582 1726855287.33797: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855287.33800: Calling groups_plugins_play to load vars for managed_node3 30582 1726855287.35341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855287.36896: done with get_vars() 30582 1726855287.36921: done getting variables 30582 1726855287.36981: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:01:27 -0400 (0:00:00.101) 0:00:23.720 ****** 30582 1726855287.37020: entering _queue_task() for managed_node3/service 30582 1726855287.37368: worker is 1 (out of 1 available) 30582 1726855287.37385: exiting _queue_task() for managed_node3/service 30582 1726855287.37398: done queuing things up, now waiting for results queue to drain 30582 1726855287.37400: waiting for pending results... 
30582 1726855287.37711: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855287.37879: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073d 30582 1726855287.37903: variable 'ansible_search_path' from source: unknown 30582 1726855287.37917: variable 'ansible_search_path' from source: unknown 30582 1726855287.37959: calling self._execute() 30582 1726855287.38061: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.38132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.38135: variable 'omit' from source: magic vars 30582 1726855287.38505: variable 'ansible_distribution_major_version' from source: facts 30582 1726855287.38522: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855287.38642: variable 'network_provider' from source: set_fact 30582 1726855287.38654: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855287.38663: when evaluation is False, skipping this task 30582 1726855287.38678: _execute() done 30582 1726855287.38688: dumping result to json 30582 1726855287.38697: done dumping result, returning 30582 1726855287.38709: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-00000000073d] 30582 1726855287.38782: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073d 30582 1726855287.38851: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073d 30582 1726855287.38858: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855287.38934: no more pending results, returning what we have 30582 1726855287.38939: results queue empty 30582 1726855287.38941: checking for any_errors_fatal 30582 1726855287.38952: done checking for 
any_errors_fatal 30582 1726855287.38953: checking for max_fail_percentage 30582 1726855287.38955: done checking for max_fail_percentage 30582 1726855287.38956: checking to see if all hosts have failed and the running result is not ok 30582 1726855287.38957: done checking to see if all hosts have failed 30582 1726855287.38957: getting the remaining hosts for this loop 30582 1726855287.38959: done getting the remaining hosts for this loop 30582 1726855287.38964: getting the next task for host managed_node3 30582 1726855287.38977: done getting next task for host managed_node3 30582 1726855287.38981: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855287.38989: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855287.39010: getting variables 30582 1726855287.39012: in VariableManager get_vars() 30582 1726855287.39046: Calling all_inventory to load vars for managed_node3 30582 1726855287.39049: Calling groups_inventory to load vars for managed_node3 30582 1726855287.39051: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855287.39063: Calling all_plugins_play to load vars for managed_node3 30582 1726855287.39066: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855287.39068: Calling groups_plugins_play to load vars for managed_node3 30582 1726855287.41454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855287.43390: done with get_vars() 30582 1726855287.43414: done getting variables 30582 1726855287.43481: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:01:27 -0400 (0:00:00.064) 0:00:23.785 ****** 30582 1726855287.43521: entering _queue_task() for managed_node3/copy 30582 1726855287.44237: worker is 1 (out of 1 available) 30582 1726855287.44249: exiting _queue_task() for managed_node3/copy 30582 1726855287.44262: done queuing things up, now waiting for results queue to drain 30582 1726855287.44263: waiting for pending results... 
30582 1726855287.44840: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855287.45096: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073e 30582 1726855287.45101: variable 'ansible_search_path' from source: unknown 30582 1726855287.45104: variable 'ansible_search_path' from source: unknown 30582 1726855287.45107: calling self._execute() 30582 1726855287.45109: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.45113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.45115: variable 'omit' from source: magic vars 30582 1726855287.45437: variable 'ansible_distribution_major_version' from source: facts 30582 1726855287.45447: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855287.45571: variable 'network_provider' from source: set_fact 30582 1726855287.45580: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855287.45583: when evaluation is False, skipping this task 30582 1726855287.45586: _execute() done 30582 1726855287.45591: dumping result to json 30582 1726855287.45593: done dumping result, returning 30582 1726855287.45604: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-00000000073e] 30582 1726855287.45607: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073e skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855287.45750: no more pending results, returning what we have 30582 1726855287.45755: results queue empty 30582 1726855287.45756: checking for any_errors_fatal 30582 1726855287.45763: done checking for any_errors_fatal 30582 1726855287.45764: checking for max_fail_percentage 30582 
1726855287.45766: done checking for max_fail_percentage 30582 1726855287.45767: checking to see if all hosts have failed and the running result is not ok 30582 1726855287.45767: done checking to see if all hosts have failed 30582 1726855287.45768: getting the remaining hosts for this loop 30582 1726855287.45770: done getting the remaining hosts for this loop 30582 1726855287.45777: getting the next task for host managed_node3 30582 1726855287.45788: done getting next task for host managed_node3 30582 1726855287.45792: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855287.45798: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855287.45819: getting variables 30582 1726855287.45821: in VariableManager get_vars() 30582 1726855287.45857: Calling all_inventory to load vars for managed_node3 30582 1726855287.45860: Calling groups_inventory to load vars for managed_node3 30582 1726855287.45864: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855287.45879: Calling all_plugins_play to load vars for managed_node3 30582 1726855287.45882: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855287.45885: Calling groups_plugins_play to load vars for managed_node3 30582 1726855287.46609: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073e 30582 1726855287.46614: WORKER PROCESS EXITING 30582 1726855287.48349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855287.51664: done with get_vars() 30582 1726855287.51704: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:01:27 -0400 (0:00:00.083) 0:00:23.868 ****** 30582 1726855287.51909: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855287.52430: worker is 1 (out of 1 available) 30582 1726855287.52444: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855287.52457: done queuing things up, now waiting for results queue to drain 30582 1726855287.52459: waiting for pending results... 
30582 1726855287.52746: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855287.52994: in run() - task 0affcc66-ac2b-aa83-7d57-00000000073f 30582 1726855287.52999: variable 'ansible_search_path' from source: unknown 30582 1726855287.53002: variable 'ansible_search_path' from source: unknown 30582 1726855287.53005: calling self._execute() 30582 1726855287.53048: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.53057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.53069: variable 'omit' from source: magic vars 30582 1726855287.53469: variable 'ansible_distribution_major_version' from source: facts 30582 1726855287.53496: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855287.53499: variable 'omit' from source: magic vars 30582 1726855287.53589: variable 'omit' from source: magic vars 30582 1726855287.53895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855287.56496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855287.56639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855287.56686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855287.56835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855287.56867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855287.57003: variable 'network_provider' from source: set_fact 30582 1726855287.57288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855287.57425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855287.57457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855287.57543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855287.57611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855287.57797: variable 'omit' from source: magic vars 30582 1726855287.57985: variable 'omit' from source: magic vars 30582 1726855287.58298: variable 'network_connections' from source: include params 30582 1726855287.58313: variable 'interface' from source: play vars 30582 1726855287.58371: variable 'interface' from source: play vars 30582 1726855287.58762: variable 'omit' from source: magic vars 30582 1726855287.58778: variable '__lsr_ansible_managed' from source: task vars 30582 1726855287.59094: variable '__lsr_ansible_managed' from source: task vars 30582 1726855287.60122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855287.60567: Loaded config def from plugin (lookup/template) 30582 1726855287.60582: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855287.60624: File lookup term: get_ansible_managed.j2 30582 1726855287.60633: variable 
'ansible_search_path' from source: unknown 30582 1726855287.60700: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855287.60716: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855287.60741: variable 'ansible_search_path' from source: unknown 30582 1726855287.71645: variable 'ansible_managed' from source: unknown 30582 1726855287.72018: variable 'omit' from source: magic vars 30582 1726855287.72060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855287.72123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855287.72179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855287.72375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855287.72379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855287.72382: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855287.72385: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.72389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.72556: Set connection var ansible_timeout to 10 30582 1726855287.72894: Set connection var ansible_connection to ssh 30582 1726855287.72897: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855287.72899: Set connection var ansible_pipelining to False 30582 1726855287.72901: Set connection var ansible_shell_executable to /bin/sh 30582 1726855287.72903: Set connection var ansible_shell_type to sh 30582 1726855287.72906: variable 'ansible_shell_executable' from source: unknown 30582 1726855287.72908: variable 'ansible_connection' from source: unknown 30582 1726855287.72910: variable 'ansible_module_compression' from source: unknown 30582 1726855287.72912: variable 'ansible_shell_type' from source: unknown 30582 1726855287.72914: variable 'ansible_shell_executable' from source: unknown 30582 1726855287.72916: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855287.72918: variable 'ansible_pipelining' from source: unknown 30582 1726855287.72920: variable 'ansible_timeout' from source: unknown 30582 1726855287.72922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855287.73042: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855287.73124: variable 'omit' from 
source: magic vars 30582 1726855287.73136: starting attempt loop 30582 1726855287.73142: running the handler 30582 1726855287.73158: _low_level_execute_command(): starting 30582 1726855287.73227: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855287.74765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855287.74812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855287.74870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855287.74978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.75085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.76777: stdout chunk (state=3): >>>/root <<< 30582 1726855287.77151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.77163: stdout chunk (state=3): >>><<< 30582 1726855287.77179: stderr chunk (state=3): >>><<< 30582 1726855287.77360: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855287.77364: _low_level_execute_command(): starting 30582 1726855287.77367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234 `" && echo ansible-tmp-1726855287.773172-31648-57236944796234="` echo /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234 `" ) && sleep 0' 30582 1726855287.78910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855287.78954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855287.79039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855287.79057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855287.79119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855287.79122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855287.79286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.79397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.81264: stdout chunk (state=3): >>>ansible-tmp-1726855287.773172-31648-57236944796234=/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234 <<< 30582 1726855287.81446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.81477: stderr chunk (state=3): >>><<< 30582 1726855287.81480: stdout chunk (state=3): >>><<< 30582 1726855287.81500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855287.773172-31648-57236944796234=/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855287.81553: variable 'ansible_module_compression' from source: unknown 30582 1726855287.81796: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855287.81805: variable 'ansible_facts' from source: unknown 30582 1726855287.82054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py 30582 1726855287.82358: Sending initial data 30582 1726855287.82368: Sent initial data (166 bytes) 30582 1726855287.83027: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855287.83050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855287.83104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855287.83193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855287.83213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855287.83236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.83470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.85023: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855287.85180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855287.85248: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp51nvezw1 /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py <<< 30582 1726855287.85251: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py" <<< 30582 1726855287.85389: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp51nvezw1" to remote "/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py" <<< 30582 1726855287.87200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.87204: stdout chunk (state=3): >>><<< 30582 1726855287.87207: stderr chunk (state=3): >>><<< 30582 1726855287.87209: done transferring module to remote 30582 1726855287.87210: _low_level_execute_command(): starting 30582 1726855287.87212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/ /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py && sleep 0' 30582 1726855287.88457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855287.88472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855287.88501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855287.88519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 
1726855287.88536: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855287.88645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855287.88893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.88960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855287.90847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855287.90860: stdout chunk (state=3): >>><<< 30582 1726855287.90879: stderr chunk (state=3): >>><<< 30582 1726855287.90902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855287.90915: _low_level_execute_command(): starting 30582 1726855287.90924: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/AnsiballZ_network_connections.py && sleep 0' 30582 1726855287.91578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855287.91600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855287.91693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855287.91715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 30582 1726855287.91730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855287.91927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.17914: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855288.19769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855288.19788: stdout chunk (state=3): >>><<< 30582 1726855288.20044: stderr chunk (state=3): >>><<< 30582 1726855288.20048: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855288.20051: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855288.20054: _low_level_execute_command(): starting 30582 1726855288.20056: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855287.773172-31648-57236944796234/ > /dev/null 2>&1 && sleep 0' 30582 1726855288.20693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855288.20716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855288.20732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855288.20749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 
1726855288.20764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855288.20777: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855288.20805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855288.20904: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.20936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.21154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.22885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.22927: stderr chunk (state=3): >>><<< 30582 1726855288.22945: stdout chunk (state=3): >>><<< 30582 1726855288.22970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855288.22981: handler run complete 30582 1726855288.23024: attempt loop complete, returning result 30582 1726855288.23036: _execute() done 30582 1726855288.23045: dumping result to json 30582 1726855288.23057: done dumping result, returning 30582 1726855288.23070: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-00000000073f] 30582 1726855288.23081: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073f 30582 1726855288.23265: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000073f 30582 1726855288.23269: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33 30582 1726855288.23398: no more pending results, returning what we have 30582 1726855288.23402: results queue empty 
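[Editor's note] The `module_args` recorded in the task result above imply role input of roughly the following shape. This is a hedged reconstruction from the logged invocation only, not taken from the actual play file; the task name and `include_role` wrapper are assumptions.

```yaml
# Sketch of the role variables implied by the logged module_args
# (reconstructed from the log; the real play may differ).
- name: Configure networking connection profiles
  vars:
    network_provider: nm
    network_connections:
      - name: statebr
        type: bridge
        autoconnect: false
        persistent_state: present
        ip:
          dhcp4: false
          auto6: false
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
```

With `provider: nm`, the module drives NetworkManager to create the persistent `statebr` bridge profile, which matches the logged stderr line `add connection statebr, 32e4a47f-...` and the `"changed": true` result.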
30582 1726855288.23403: checking for any_errors_fatal 30582 1726855288.23411: done checking for any_errors_fatal 30582 1726855288.23412: checking for max_fail_percentage 30582 1726855288.23414: done checking for max_fail_percentage 30582 1726855288.23415: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.23416: done checking to see if all hosts have failed 30582 1726855288.23416: getting the remaining hosts for this loop 30582 1726855288.23418: done getting the remaining hosts for this loop 30582 1726855288.23422: getting the next task for host managed_node3 30582 1726855288.23430: done getting next task for host managed_node3 30582 1726855288.23434: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855288.23438: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.23450: getting variables 30582 1726855288.23451: in VariableManager get_vars() 30582 1726855288.23691: Calling all_inventory to load vars for managed_node3 30582 1726855288.23696: Calling groups_inventory to load vars for managed_node3 30582 1726855288.23699: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.23715: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.23718: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.23722: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.25358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.26805: done with get_vars() 30582 1726855288.26836: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:01:28 -0400 (0:00:00.750) 0:00:24.619 ****** 30582 1726855288.26953: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855288.27334: worker is 1 (out of 1 available) 30582 1726855288.27347: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855288.27359: done queuing things up, now waiting for results queue to drain 30582 1726855288.27361: waiting for pending results... 
30582 1726855288.27711: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855288.27773: in run() - task 0affcc66-ac2b-aa83-7d57-000000000740 30582 1726855288.27777: variable 'ansible_search_path' from source: unknown 30582 1726855288.27780: variable 'ansible_search_path' from source: unknown 30582 1726855288.27791: calling self._execute() 30582 1726855288.27870: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.27874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.27884: variable 'omit' from source: magic vars 30582 1726855288.28192: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.28203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.28294: variable 'network_state' from source: role '' defaults 30582 1726855288.28308: Evaluated conditional (network_state != {}): False 30582 1726855288.28311: when evaluation is False, skipping this task 30582 1726855288.28314: _execute() done 30582 1726855288.28316: dumping result to json 30582 1726855288.28318: done dumping result, returning 30582 1726855288.28322: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000000740] 30582 1726855288.28327: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000740 30582 1726855288.28420: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000740 30582 1726855288.28422: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855288.28470: no more pending results, returning what we have 30582 1726855288.28475: results queue empty 30582 1726855288.28476: checking for any_errors_fatal 30582 1726855288.28487: done checking for any_errors_fatal 
30582 1726855288.28489: checking for max_fail_percentage 30582 1726855288.28491: done checking for max_fail_percentage 30582 1726855288.28492: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.28492: done checking to see if all hosts have failed 30582 1726855288.28493: getting the remaining hosts for this loop 30582 1726855288.28495: done getting the remaining hosts for this loop 30582 1726855288.28499: getting the next task for host managed_node3 30582 1726855288.28507: done getting next task for host managed_node3 30582 1726855288.28511: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855288.28516: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.28535: getting variables 30582 1726855288.28537: in VariableManager get_vars() 30582 1726855288.28571: Calling all_inventory to load vars for managed_node3 30582 1726855288.28574: Calling groups_inventory to load vars for managed_node3 30582 1726855288.28576: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.28591: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.28594: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.28597: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.29436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.30819: done with get_vars() 30582 1726855288.30844: done getting variables 30582 1726855288.30907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:01:28 -0400 (0:00:00.039) 0:00:24.659 ****** 30582 1726855288.30943: entering _queue_task() for managed_node3/debug 30582 1726855288.31291: worker is 1 (out of 1 available) 30582 1726855288.31303: exiting _queue_task() for managed_node3/debug 30582 1726855288.31316: done queuing things up, now waiting for results queue to drain 30582 1726855288.31317: waiting for pending results... 
30582 1726855288.31604: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855288.31696: in run() - task 0affcc66-ac2b-aa83-7d57-000000000741 30582 1726855288.31711: variable 'ansible_search_path' from source: unknown 30582 1726855288.31715: variable 'ansible_search_path' from source: unknown 30582 1726855288.31744: calling self._execute() 30582 1726855288.31821: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.31825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.31832: variable 'omit' from source: magic vars 30582 1726855288.32112: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.32124: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.32128: variable 'omit' from source: magic vars 30582 1726855288.32174: variable 'omit' from source: magic vars 30582 1726855288.32200: variable 'omit' from source: magic vars 30582 1726855288.32232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855288.32262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855288.32280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855288.32295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.32308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.32329: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855288.32332: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.32335: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855288.32414: Set connection var ansible_timeout to 10 30582 1726855288.32417: Set connection var ansible_connection to ssh 30582 1726855288.32422: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855288.32425: Set connection var ansible_pipelining to False 30582 1726855288.32431: Set connection var ansible_shell_executable to /bin/sh 30582 1726855288.32434: Set connection var ansible_shell_type to sh 30582 1726855288.32450: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.32453: variable 'ansible_connection' from source: unknown 30582 1726855288.32457: variable 'ansible_module_compression' from source: unknown 30582 1726855288.32460: variable 'ansible_shell_type' from source: unknown 30582 1726855288.32462: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.32464: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.32466: variable 'ansible_pipelining' from source: unknown 30582 1726855288.32468: variable 'ansible_timeout' from source: unknown 30582 1726855288.32470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.32573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855288.32589: variable 'omit' from source: magic vars 30582 1726855288.32592: starting attempt loop 30582 1726855288.32595: running the handler 30582 1726855288.32691: variable '__network_connections_result' from source: set_fact 30582 1726855288.32732: handler run complete 30582 1726855288.32748: attempt loop complete, returning result 30582 1726855288.32751: _execute() done 30582 1726855288.32754: dumping result to json 30582 1726855288.32757: 
done dumping result, returning 30582 1726855288.32763: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000741] 30582 1726855288.32768: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000741 30582 1726855288.32849: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000741 30582 1726855288.32854: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33" ] } 30582 1726855288.32921: no more pending results, returning what we have 30582 1726855288.32924: results queue empty 30582 1726855288.32925: checking for any_errors_fatal 30582 1726855288.32932: done checking for any_errors_fatal 30582 1726855288.32932: checking for max_fail_percentage 30582 1726855288.32934: done checking for max_fail_percentage 30582 1726855288.32935: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.32936: done checking to see if all hosts have failed 30582 1726855288.32936: getting the remaining hosts for this loop 30582 1726855288.32938: done getting the remaining hosts for this loop 30582 1726855288.32941: getting the next task for host managed_node3 30582 1726855288.32950: done getting next task for host managed_node3 30582 1726855288.32953: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855288.32958: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855288.32975: getting variables 30582 1726855288.32977: in VariableManager get_vars() 30582 1726855288.33012: Calling all_inventory to load vars for managed_node3 30582 1726855288.33015: Calling groups_inventory to load vars for managed_node3 30582 1726855288.33017: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.33026: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.33029: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.33031: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.34370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.35225: done with get_vars() 30582 1726855288.35242: done getting variables 30582 1726855288.35290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:01:28 -0400 (0:00:00.043) 0:00:24.703 ****** 30582 1726855288.35322: entering _queue_task() for managed_node3/debug 30582 1726855288.35576: worker is 1 (out of 1 available) 30582 1726855288.35593: exiting _queue_task() for managed_node3/debug 30582 1726855288.35605: done queuing things up, now waiting for results queue to drain 30582 1726855288.35607: waiting for pending results... 30582 1726855288.35794: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855288.35889: in run() - task 0affcc66-ac2b-aa83-7d57-000000000742 30582 1726855288.35902: variable 'ansible_search_path' from source: unknown 30582 1726855288.35905: variable 'ansible_search_path' from source: unknown 30582 1726855288.35932: calling self._execute() 30582 1726855288.36006: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.36010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.36017: variable 'omit' from source: magic vars 30582 1726855288.36291: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.36300: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.36306: variable 'omit' from source: magic vars 30582 1726855288.36342: variable 'omit' from source: magic vars 30582 1726855288.36365: variable 'omit' from source: magic vars 30582 1726855288.36399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855288.36425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855288.36441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855288.36454: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.36463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.36491: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855288.36494: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.36497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.36564: Set connection var ansible_timeout to 10 30582 1726855288.36567: Set connection var ansible_connection to ssh 30582 1726855288.36575: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855288.36577: Set connection var ansible_pipelining to False 30582 1726855288.36582: Set connection var ansible_shell_executable to /bin/sh 30582 1726855288.36584: Set connection var ansible_shell_type to sh 30582 1726855288.36604: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.36609: variable 'ansible_connection' from source: unknown 30582 1726855288.36612: variable 'ansible_module_compression' from source: unknown 30582 1726855288.36614: variable 'ansible_shell_type' from source: unknown 30582 1726855288.36616: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.36618: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.36620: variable 'ansible_pipelining' from source: unknown 30582 1726855288.36622: variable 'ansible_timeout' from source: unknown 30582 1726855288.36627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.36730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855288.36740: variable 'omit' from source: magic vars 30582 1726855288.36745: starting attempt loop 30582 1726855288.36748: running the handler 30582 1726855288.36786: variable '__network_connections_result' from source: set_fact 30582 1726855288.36845: variable '__network_connections_result' from source: set_fact 30582 1726855288.36924: handler run complete 30582 1726855288.36942: attempt loop complete, returning result 30582 1726855288.36945: _execute() done 30582 1726855288.36952: dumping result to json 30582 1726855288.36955: done dumping result, returning 30582 1726855288.36960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000742] 30582 1726855288.36964: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000742 30582 1726855288.37050: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000742 30582 1726855288.37053: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33" ] } } 30582 1726855288.37142: no more pending results, returning what we have 30582 
1726855288.37147: results queue empty 30582 1726855288.37148: checking for any_errors_fatal 30582 1726855288.37156: done checking for any_errors_fatal 30582 1726855288.37156: checking for max_fail_percentage 30582 1726855288.37159: done checking for max_fail_percentage 30582 1726855288.37159: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.37160: done checking to see if all hosts have failed 30582 1726855288.37161: getting the remaining hosts for this loop 30582 1726855288.37162: done getting the remaining hosts for this loop 30582 1726855288.37167: getting the next task for host managed_node3 30582 1726855288.37177: done getting next task for host managed_node3 30582 1726855288.37181: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855288.37184: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.37196: getting variables 30582 1726855288.37203: in VariableManager get_vars() 30582 1726855288.37232: Calling all_inventory to load vars for managed_node3 30582 1726855288.37235: Calling groups_inventory to load vars for managed_node3 30582 1726855288.37237: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.37245: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.37247: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.37249: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.38030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.39011: done with get_vars() 30582 1726855288.39029: done getting variables 30582 1726855288.39075: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:01:28 -0400 (0:00:00.037) 0:00:24.740 ****** 30582 1726855288.39102: entering _queue_task() for managed_node3/debug 30582 1726855288.39356: worker is 1 (out of 1 available) 30582 1726855288.39370: exiting _queue_task() for managed_node3/debug 30582 1726855288.39384: done queuing things up, now waiting for results queue to drain 30582 1726855288.39386: waiting for pending results... 
30582 1726855288.39575: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855288.39676: in run() - task 0affcc66-ac2b-aa83-7d57-000000000743 30582 1726855288.39685: variable 'ansible_search_path' from source: unknown 30582 1726855288.39692: variable 'ansible_search_path' from source: unknown 30582 1726855288.39718: calling self._execute() 30582 1726855288.39786: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.39791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.39799: variable 'omit' from source: magic vars 30582 1726855288.40070: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.40080: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.40167: variable 'network_state' from source: role '' defaults 30582 1726855288.40171: Evaluated conditional (network_state != {}): False 30582 1726855288.40176: when evaluation is False, skipping this task 30582 1726855288.40179: _execute() done 30582 1726855288.40181: dumping result to json 30582 1726855288.40184: done dumping result, returning 30582 1726855288.40192: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000000743] 30582 1726855288.40197: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000743 30582 1726855288.40282: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000743 30582 1726855288.40284: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855288.40336: no more pending results, returning what we have 30582 1726855288.40340: results queue empty 30582 1726855288.40341: checking for any_errors_fatal 30582 1726855288.40350: done checking for any_errors_fatal 30582 1726855288.40351: checking for 
max_fail_percentage 30582 1726855288.40353: done checking for max_fail_percentage 30582 1726855288.40354: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.40354: done checking to see if all hosts have failed 30582 1726855288.40355: getting the remaining hosts for this loop 30582 1726855288.40357: done getting the remaining hosts for this loop 30582 1726855288.40360: getting the next task for host managed_node3 30582 1726855288.40369: done getting next task for host managed_node3 30582 1726855288.40376: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855288.40380: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.40400: getting variables 30582 1726855288.40407: in VariableManager get_vars() 30582 1726855288.40438: Calling all_inventory to load vars for managed_node3 30582 1726855288.40440: Calling groups_inventory to load vars for managed_node3 30582 1726855288.40442: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.40451: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.40454: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.40456: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.44605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.45460: done with get_vars() 30582 1726855288.45481: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:01:28 -0400 (0:00:00.064) 0:00:24.805 ****** 30582 1726855288.45544: entering _queue_task() for managed_node3/ping 30582 1726855288.45819: worker is 1 (out of 1 available) 30582 1726855288.45833: exiting _queue_task() for managed_node3/ping 30582 1726855288.45845: done queuing things up, now waiting for results queue to drain 30582 1726855288.45846: waiting for pending results... 
30582 1726855288.46034: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855288.46144: in run() - task 0affcc66-ac2b-aa83-7d57-000000000744 30582 1726855288.46155: variable 'ansible_search_path' from source: unknown 30582 1726855288.46159: variable 'ansible_search_path' from source: unknown 30582 1726855288.46192: calling self._execute() 30582 1726855288.46261: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.46267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.46278: variable 'omit' from source: magic vars 30582 1726855288.46553: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.46563: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.46569: variable 'omit' from source: magic vars 30582 1726855288.46611: variable 'omit' from source: magic vars 30582 1726855288.46636: variable 'omit' from source: magic vars 30582 1726855288.46668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855288.46698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855288.46717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855288.46732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.46742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.46767: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855288.46770: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.46775: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855288.46847: Set connection var ansible_timeout to 10 30582 1726855288.46851: Set connection var ansible_connection to ssh 30582 1726855288.46856: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855288.46862: Set connection var ansible_pipelining to False 30582 1726855288.46867: Set connection var ansible_shell_executable to /bin/sh 30582 1726855288.46869: Set connection var ansible_shell_type to sh 30582 1726855288.46890: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.46893: variable 'ansible_connection' from source: unknown 30582 1726855288.46896: variable 'ansible_module_compression' from source: unknown 30582 1726855288.46899: variable 'ansible_shell_type' from source: unknown 30582 1726855288.46901: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.46903: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.46906: variable 'ansible_pipelining' from source: unknown 30582 1726855288.46908: variable 'ansible_timeout' from source: unknown 30582 1726855288.46912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.47062: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855288.47071: variable 'omit' from source: magic vars 30582 1726855288.47077: starting attempt loop 30582 1726855288.47080: running the handler 30582 1726855288.47092: _low_level_execute_command(): starting 30582 1726855288.47099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855288.47586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855288.47627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855288.47630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.47633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855288.47636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.47676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855288.47680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.47693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.47768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.49465: stdout chunk (state=3): >>>/root <<< 30582 1726855288.49568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.49600: stderr chunk (state=3): >>><<< 30582 1726855288.49603: stdout chunk (state=3): >>><<< 30582 1726855288.49625: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855288.49636: _low_level_execute_command(): starting 30582 1726855288.49642: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005 `" && echo ansible-tmp-1726855288.4962308-31719-256862050353005="` echo /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005 `" ) && sleep 0' 30582 1726855288.50061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855288.50092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855288.50097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.50107: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855288.50109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.50154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855288.50157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.50163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.50222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.52134: stdout chunk (state=3): >>>ansible-tmp-1726855288.4962308-31719-256862050353005=/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005 <<< 30582 1726855288.52235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.52259: stderr chunk (state=3): >>><<< 30582 1726855288.52264: stdout chunk (state=3): >>><<< 30582 1726855288.52281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855288.4962308-31719-256862050353005=/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855288.52323: variable 'ansible_module_compression' from source: unknown 30582 1726855288.52354: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855288.52383: variable 'ansible_facts' from source: unknown 30582 1726855288.52440: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py 30582 1726855288.52539: Sending initial data 30582 1726855288.52542: Sent initial data (153 bytes) 30582 1726855288.52951: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855288.52961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.52983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855288.52986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.53084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.53159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.54709: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855288.54804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855288.54879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpqgh_g5ze /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py <<< 30582 1726855288.54882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py" <<< 30582 1726855288.54930: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpqgh_g5ze" to remote "/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py" <<< 30582 1726855288.55739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.55763: stderr chunk (state=3): >>><<< 30582 1726855288.55849: stdout chunk (state=3): >>><<< 30582 1726855288.55852: done transferring module to remote 30582 1726855288.55855: _low_level_execute_command(): starting 30582 1726855288.55857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/ /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py && sleep 0' 30582 1726855288.56468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855288.56490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855288.56513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855288.56615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.56649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855288.56669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.56698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.56792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.58630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.58633: stdout chunk (state=3): >>><<< 30582 1726855288.58693: stderr chunk (state=3): >>><<< 30582 1726855288.58696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855288.58699: _low_level_execute_command(): starting 30582 1726855288.58701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/AnsiballZ_ping.py && sleep 0' 30582 1726855288.59316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855288.59404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855288.59430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.59441: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855288.59536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.74432: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855288.75850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855288.75854: stdout chunk (state=3): >>><<< 30582 1726855288.75856: stderr chunk (state=3): >>><<< 30582 1726855288.76025: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855288.76030: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855288.76033: _low_level_execute_command(): starting 30582 1726855288.76035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855288.4962308-31719-256862050353005/ > /dev/null 2>&1 && sleep 0' 30582 1726855288.76709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855288.76713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855288.76715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855288.76827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855288.76864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855288.76947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855288.78873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855288.78877: stderr chunk (state=3): >>><<< 30582 1726855288.78880: stdout chunk (state=3): >>><<< 30582 1726855288.78897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855288.78990: handler run complete 30582 1726855288.79031: attempt loop complete, returning result 
30582 1726855288.79038: _execute() done 30582 1726855288.79041: dumping result to json 30582 1726855288.79045: done dumping result, returning 30582 1726855288.79047: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000000744] 30582 1726855288.79049: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000744 30582 1726855288.79114: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000744 30582 1726855288.79116: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855288.79408: no more pending results, returning what we have 30582 1726855288.79412: results queue empty 30582 1726855288.79413: checking for any_errors_fatal 30582 1726855288.79420: done checking for any_errors_fatal 30582 1726855288.79420: checking for max_fail_percentage 30582 1726855288.79422: done checking for max_fail_percentage 30582 1726855288.79423: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.79424: done checking to see if all hosts have failed 30582 1726855288.79424: getting the remaining hosts for this loop 30582 1726855288.79426: done getting the remaining hosts for this loop 30582 1726855288.79429: getting the next task for host managed_node3 30582 1726855288.79438: done getting next task for host managed_node3 30582 1726855288.79440: ^ task is: TASK: meta (role_complete) 30582 1726855288.79444: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855288.79454: getting variables 30582 1726855288.79456: in VariableManager get_vars() 30582 1726855288.79505: Calling all_inventory to load vars for managed_node3 30582 1726855288.79508: Calling groups_inventory to load vars for managed_node3 30582 1726855288.79510: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.79519: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.79524: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.79527: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.81553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.82547: done with get_vars() 30582 1726855288.82563: done getting variables 30582 1726855288.82625: done queuing things up, now waiting for results queue to drain 30582 1726855288.82627: results queue empty 30582 1726855288.82627: checking for any_errors_fatal 30582 1726855288.82629: done checking for any_errors_fatal 30582 1726855288.82630: checking for max_fail_percentage 30582 1726855288.82630: done checking for max_fail_percentage 30582 1726855288.82631: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855288.82631: done checking to see if all hosts have failed 30582 1726855288.82632: getting the remaining hosts for this loop 30582 1726855288.82632: done getting the remaining hosts for this loop 30582 1726855288.82634: getting the next task for host managed_node3 30582 1726855288.82637: done getting next task for host managed_node3 30582 1726855288.82639: ^ task is: TASK: Show result 30582 1726855288.82640: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.82642: getting variables 30582 1726855288.82643: in VariableManager get_vars() 30582 1726855288.82649: Calling all_inventory to load vars for managed_node3 30582 1726855288.82651: Calling groups_inventory to load vars for managed_node3 30582 1726855288.82652: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.82655: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.82657: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.82658: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.83881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.85128: done with get_vars() 30582 1726855288.85149: done getting variables 30582 1726855288.85182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 14:01:28 -0400 (0:00:00.396) 0:00:25.202 ****** 30582 1726855288.85208: entering _queue_task() for managed_node3/debug 30582 1726855288.85468: worker is 1 (out of 1 available) 30582 1726855288.85481: exiting _queue_task() for managed_node3/debug 30582 1726855288.85495: done queuing things up, now waiting for results queue to drain 30582 1726855288.85497: waiting for pending results... 
30582 1726855288.85682: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855288.85771: in run() - task 0affcc66-ac2b-aa83-7d57-0000000006b2 30582 1726855288.85785: variable 'ansible_search_path' from source: unknown 30582 1726855288.85791: variable 'ansible_search_path' from source: unknown 30582 1726855288.85819: calling self._execute() 30582 1726855288.85901: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.85906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.85915: variable 'omit' from source: magic vars 30582 1726855288.86213: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.86223: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.86229: variable 'omit' from source: magic vars 30582 1726855288.86257: variable 'omit' from source: magic vars 30582 1726855288.86290: variable 'omit' from source: magic vars 30582 1726855288.86327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855288.86354: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855288.86371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855288.86391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.86400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855288.86423: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855288.86426: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.86430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.86505: Set 
connection var ansible_timeout to 10 30582 1726855288.86509: Set connection var ansible_connection to ssh 30582 1726855288.86514: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855288.86519: Set connection var ansible_pipelining to False 30582 1726855288.86524: Set connection var ansible_shell_executable to /bin/sh 30582 1726855288.86527: Set connection var ansible_shell_type to sh 30582 1726855288.86543: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.86546: variable 'ansible_connection' from source: unknown 30582 1726855288.86549: variable 'ansible_module_compression' from source: unknown 30582 1726855288.86551: variable 'ansible_shell_type' from source: unknown 30582 1726855288.86553: variable 'ansible_shell_executable' from source: unknown 30582 1726855288.86556: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.86558: variable 'ansible_pipelining' from source: unknown 30582 1726855288.86560: variable 'ansible_timeout' from source: unknown 30582 1726855288.86565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.86668: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855288.86679: variable 'omit' from source: magic vars 30582 1726855288.86684: starting attempt loop 30582 1726855288.86689: running the handler 30582 1726855288.86729: variable '__network_connections_result' from source: set_fact 30582 1726855288.86790: variable '__network_connections_result' from source: set_fact 30582 1726855288.86872: handler run complete 30582 1726855288.86894: attempt loop complete, returning result 30582 1726855288.86897: _execute() done 30582 1726855288.86899: dumping result to json 30582 
1726855288.86904: done dumping result, returning 30582 1726855288.86912: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-0000000006b2] 30582 1726855288.86916: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000006b2 30582 1726855288.87010: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000006b2 30582 1726855288.87013: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 32e4a47f-d12d-469b-92d8-81cf9f125a33" ] } } 30582 1726855288.87100: no more pending results, returning what we have 30582 1726855288.87104: results queue empty 30582 1726855288.87105: checking for any_errors_fatal 30582 1726855288.87107: done checking for any_errors_fatal 30582 1726855288.87108: checking for max_fail_percentage 30582 1726855288.87110: done checking for max_fail_percentage 30582 1726855288.87111: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.87111: done checking to see if all hosts have failed 30582 1726855288.87112: getting the remaining hosts for this loop 30582 1726855288.87113: done getting the remaining hosts for this loop 30582 1726855288.87117: getting the next task for host managed_node3 30582 1726855288.87126: done getting next task for host managed_node3 30582 1726855288.87130: ^ task is: TASK: Asserts 30582 1726855288.87132: ^ 
state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855288.87138: getting variables 30582 1726855288.87140: in VariableManager get_vars() 30582 1726855288.87170: Calling all_inventory to load vars for managed_node3 30582 1726855288.87173: Calling groups_inventory to load vars for managed_node3 30582 1726855288.87176: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.87186: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.87196: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.87200: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.88138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.89008: done with get_vars() 30582 1726855288.89026: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:01:28 -0400 (0:00:00.038) 0:00:25.240 ****** 30582 1726855288.89105: entering _queue_task() for managed_node3/include_tasks 30582 1726855288.89363: worker is 1 (out of 1 available) 30582 1726855288.89382: exiting _queue_task() for managed_node3/include_tasks 30582 1726855288.89395: done queuing things up, now waiting for results queue to drain 30582 1726855288.89397: waiting 
for pending results... 30582 1726855288.89578: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855288.89664: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005b9 30582 1726855288.89678: variable 'ansible_search_path' from source: unknown 30582 1726855288.89683: variable 'ansible_search_path' from source: unknown 30582 1726855288.89717: variable 'lsr_assert' from source: include params 30582 1726855288.89881: variable 'lsr_assert' from source: include params 30582 1726855288.89934: variable 'omit' from source: magic vars 30582 1726855288.90030: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.90036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.90044: variable 'omit' from source: magic vars 30582 1726855288.90212: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.90220: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.90225: variable 'item' from source: unknown 30582 1726855288.90272: variable 'item' from source: unknown 30582 1726855288.90297: variable 'item' from source: unknown 30582 1726855288.90339: variable 'item' from source: unknown 30582 1726855288.90465: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.90469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.90471: variable 'omit' from source: magic vars 30582 1726855288.90543: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.90546: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.90552: variable 'item' from source: unknown 30582 1726855288.90598: variable 'item' from source: unknown 30582 1726855288.90618: variable 'item' from source: unknown 30582 1726855288.90660: variable 'item' from source: unknown 30582 1726855288.90729: dumping result to json 30582 1726855288.90731: done dumping 
result, returning 30582 1726855288.90734: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-0000000005b9] 30582 1726855288.90735: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b9 30582 1726855288.90767: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005b9 30582 1726855288.90769: WORKER PROCESS EXITING 30582 1726855288.90795: no more pending results, returning what we have 30582 1726855288.90800: in VariableManager get_vars() 30582 1726855288.90835: Calling all_inventory to load vars for managed_node3 30582 1726855288.90837: Calling groups_inventory to load vars for managed_node3 30582 1726855288.90840: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.90854: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.90857: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.90859: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.91678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.92554: done with get_vars() 30582 1726855288.92567: variable 'ansible_search_path' from source: unknown 30582 1726855288.92568: variable 'ansible_search_path' from source: unknown 30582 1726855288.92601: variable 'ansible_search_path' from source: unknown 30582 1726855288.92602: variable 'ansible_search_path' from source: unknown 30582 1726855288.92618: we have included files to process 30582 1726855288.92619: generating all_blocks data 30582 1726855288.92621: done generating all_blocks data 30582 1726855288.92626: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855288.92627: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 
1726855288.92629: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855288.92706: in VariableManager get_vars() 30582 1726855288.92719: done with get_vars() 30582 1726855288.92801: done processing included file 30582 1726855288.92802: iterating over new_blocks loaded from include file 30582 1726855288.92803: in VariableManager get_vars() 30582 1726855288.92812: done with get_vars() 30582 1726855288.92814: filtering new block on tags 30582 1726855288.92835: done filtering new block on tags 30582 1726855288.92837: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item=tasks/assert_device_absent.yml) 30582 1726855288.92841: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855288.92842: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855288.92845: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855288.92911: in VariableManager get_vars() 30582 1726855288.92923: done with get_vars() 30582 1726855288.93077: done processing included file 30582 1726855288.93079: iterating over new_blocks loaded from include file 30582 1726855288.93080: in VariableManager get_vars() 30582 1726855288.93090: done with get_vars() 30582 1726855288.93091: filtering new block on tags 30582 1726855288.93118: done filtering new block on tags 30582 1726855288.93120: done iterating over new_blocks loaded from include file included: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=tasks/assert_profile_present.yml) 30582 1726855288.93122: extending task lists for all hosts with included blocks 30582 1726855288.93736: done extending task lists 30582 1726855288.93737: done processing included files 30582 1726855288.93737: results queue empty 30582 1726855288.93738: checking for any_errors_fatal 30582 1726855288.93741: done checking for any_errors_fatal 30582 1726855288.93742: checking for max_fail_percentage 30582 1726855288.93743: done checking for max_fail_percentage 30582 1726855288.93743: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.93744: done checking to see if all hosts have failed 30582 1726855288.93744: getting the remaining hosts for this loop 30582 1726855288.93745: done getting the remaining hosts for this loop 30582 1726855288.93747: getting the next task for host managed_node3 30582 1726855288.93749: done getting next task for host managed_node3 30582 1726855288.93751: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855288.93753: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855288.93758: getting variables 30582 1726855288.93759: in VariableManager get_vars() 30582 1726855288.93765: Calling all_inventory to load vars for managed_node3 30582 1726855288.93766: Calling groups_inventory to load vars for managed_node3 30582 1726855288.93768: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.93774: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.93776: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.93778: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.94445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.95299: done with get_vars() 30582 1726855288.95315: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 14:01:28 -0400 (0:00:00.062) 0:00:25.303 ****** 30582 1726855288.95369: entering _queue_task() for managed_node3/include_tasks 30582 1726855288.95638: worker is 1 (out of 1 available) 30582 1726855288.95653: exiting _queue_task() for managed_node3/include_tasks 30582 1726855288.95665: done queuing things up, now waiting for results queue to drain 30582 1726855288.95666: waiting for pending results... 
30582 1726855288.95844: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855288.95920: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008a8 30582 1726855288.95932: variable 'ansible_search_path' from source: unknown 30582 1726855288.95936: variable 'ansible_search_path' from source: unknown 30582 1726855288.95962: calling self._execute() 30582 1726855288.96039: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855288.96043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855288.96050: variable 'omit' from source: magic vars 30582 1726855288.96338: variable 'ansible_distribution_major_version' from source: facts 30582 1726855288.96348: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855288.96353: _execute() done 30582 1726855288.96357: dumping result to json 30582 1726855288.96359: done dumping result, returning 30582 1726855288.96367: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-0000000008a8] 30582 1726855288.96371: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008a8 30582 1726855288.96457: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008a8 30582 1726855288.96460: WORKER PROCESS EXITING 30582 1726855288.96495: no more pending results, returning what we have 30582 1726855288.96502: in VariableManager get_vars() 30582 1726855288.96538: Calling all_inventory to load vars for managed_node3 30582 1726855288.96541: Calling groups_inventory to load vars for managed_node3 30582 1726855288.96544: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.96558: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.96560: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.96563: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855288.97386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855288.98352: done with get_vars() 30582 1726855288.98365: variable 'ansible_search_path' from source: unknown 30582 1726855288.98366: variable 'ansible_search_path' from source: unknown 30582 1726855288.98375: variable 'item' from source: include params 30582 1726855288.98458: variable 'item' from source: include params 30582 1726855288.98485: we have included files to process 30582 1726855288.98486: generating all_blocks data 30582 1726855288.98489: done generating all_blocks data 30582 1726855288.98490: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855288.98491: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855288.98492: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855288.98620: done processing included file 30582 1726855288.98621: iterating over new_blocks loaded from include file 30582 1726855288.98622: in VariableManager get_vars() 30582 1726855288.98632: done with get_vars() 30582 1726855288.98633: filtering new block on tags 30582 1726855288.98652: done filtering new block on tags 30582 1726855288.98654: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855288.98658: extending task lists for all hosts with included blocks 30582 1726855288.98755: done extending task lists 30582 1726855288.98756: done processing included files 30582 1726855288.98757: results queue empty 30582 1726855288.98757: checking for any_errors_fatal 30582 1726855288.98760: done 
checking for any_errors_fatal 30582 1726855288.98761: checking for max_fail_percentage 30582 1726855288.98761: done checking for max_fail_percentage 30582 1726855288.98762: checking to see if all hosts have failed and the running result is not ok 30582 1726855288.98763: done checking to see if all hosts have failed 30582 1726855288.98763: getting the remaining hosts for this loop 30582 1726855288.98764: done getting the remaining hosts for this loop 30582 1726855288.98765: getting the next task for host managed_node3 30582 1726855288.98768: done getting next task for host managed_node3 30582 1726855288.98770: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855288.98772: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855288.98775: getting variables 30582 1726855288.98776: in VariableManager get_vars() 30582 1726855288.98782: Calling all_inventory to load vars for managed_node3 30582 1726855288.98784: Calling groups_inventory to load vars for managed_node3 30582 1726855288.98785: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855288.98791: Calling all_plugins_play to load vars for managed_node3 30582 1726855288.98792: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855288.98794: Calling groups_plugins_play to load vars for managed_node3 30582 1726855288.99445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.00318: done with get_vars() 30582 1726855289.00333: done getting variables 30582 1726855289.00426: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:01:29 -0400 (0:00:00.050) 0:00:25.354 ****** 30582 1726855289.00448: entering _queue_task() for managed_node3/stat 30582 1726855289.00734: worker is 1 (out of 1 available) 30582 1726855289.00748: exiting _queue_task() for managed_node3/stat 30582 1726855289.00760: done queuing things up, now waiting for results queue to drain 30582 1726855289.00762: waiting for pending results... 
30582 1726855289.00950: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855289.01040: in run() - task 0affcc66-ac2b-aa83-7d57-000000000928 30582 1726855289.01050: variable 'ansible_search_path' from source: unknown 30582 1726855289.01054: variable 'ansible_search_path' from source: unknown 30582 1726855289.01084: calling self._execute() 30582 1726855289.01153: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.01157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.01165: variable 'omit' from source: magic vars 30582 1726855289.01437: variable 'ansible_distribution_major_version' from source: facts 30582 1726855289.01446: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855289.01452: variable 'omit' from source: magic vars 30582 1726855289.01492: variable 'omit' from source: magic vars 30582 1726855289.01560: variable 'interface' from source: play vars 30582 1726855289.01574: variable 'omit' from source: magic vars 30582 1726855289.01699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855289.01703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855289.01706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855289.01709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.01711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.01808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855289.01812: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.01815: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.01843: Set connection var ansible_timeout to 10 30582 1726855289.01847: Set connection var ansible_connection to ssh 30582 1726855289.01853: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855289.01859: Set connection var ansible_pipelining to False 30582 1726855289.01864: Set connection var ansible_shell_executable to /bin/sh 30582 1726855289.01867: Set connection var ansible_shell_type to sh 30582 1726855289.01893: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.01897: variable 'ansible_connection' from source: unknown 30582 1726855289.01899: variable 'ansible_module_compression' from source: unknown 30582 1726855289.01902: variable 'ansible_shell_type' from source: unknown 30582 1726855289.01904: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.01974: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.01978: variable 'ansible_pipelining' from source: unknown 30582 1726855289.01981: variable 'ansible_timeout' from source: unknown 30582 1726855289.01983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.02115: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855289.02131: variable 'omit' from source: magic vars 30582 1726855289.02135: starting attempt loop 30582 1726855289.02137: running the handler 30582 1726855289.02172: _low_level_execute_command(): starting 30582 1726855289.02175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855289.03313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855289.03318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.03403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855289.03407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.03410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855289.03412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.03442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.03531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.05236: stdout chunk (state=3): >>>/root <<< 30582 1726855289.05371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.05375: stderr chunk (state=3): >>><<< 30582 1726855289.05384: stdout chunk (state=3): >>><<< 30582 1726855289.05485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.05492: _low_level_execute_command(): starting 30582 1726855289.05501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675 `" && echo ansible-tmp-1726855289.0541341-31748-45037224085675="` echo /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675 `" ) && sleep 0' 30582 1726855289.06709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.06789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855289.06836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.06839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.06899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.09401: stdout chunk (state=3): >>>ansible-tmp-1726855289.0541341-31748-45037224085675=/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675 <<< 30582 1726855289.09406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.09409: stdout chunk (state=3): >>><<< 30582 1726855289.09412: stderr chunk (state=3): >>><<< 30582 1726855289.09415: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855289.0541341-31748-45037224085675=/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.09418: variable 'ansible_module_compression' from source: unknown 30582 1726855289.09420: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855289.09456: variable 'ansible_facts' from source: unknown 30582 1726855289.09796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py 30582 1726855289.09956: Sending initial data 30582 1726855289.09965: Sent initial data (152 bytes) 30582 1726855289.11213: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.11329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855289.11333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.11507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.11592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.13218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855289.13307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855289.13359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpmuq_jvel /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py <<< 30582 1726855289.13363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py" <<< 30582 1726855289.13437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpmuq_jvel" to remote "/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py" <<< 30582 1726855289.14510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.14663: stderr chunk (state=3): >>><<< 30582 1726855289.14666: stdout chunk (state=3): >>><<< 30582 1726855289.14741: done transferring module to remote 30582 1726855289.14906: _low_level_execute_command(): starting 30582 1726855289.14909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/ /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py && sleep 0' 30582 1726855289.16195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.16200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855289.16338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.16343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.16404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.18253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.18324: stderr chunk (state=3): >>><<< 30582 1726855289.18334: stdout chunk (state=3): >>><<< 30582 1726855289.18357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.18368: _low_level_execute_command(): starting 30582 1726855289.18376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/AnsiballZ_stat.py && sleep 0' 30582 1726855289.19157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.19316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855289.19385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.19458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.19499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.19664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.35133: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855289.36471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.36537: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855289.36547: stdout chunk (state=3): >>><<< 30582 1726855289.36563: stderr chunk (state=3): >>><<< 30582 1726855289.36696: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855289.36700: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855289.36702: _low_level_execute_command(): starting 30582 1726855289.36705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855289.0541341-31748-45037224085675/ > /dev/null 2>&1 && sleep 0' 30582 1726855289.37234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.37251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855289.37303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.37367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855289.37391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.37415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.37500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.39485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.39491: stdout chunk (state=3): >>><<< 30582 1726855289.39497: stderr chunk (state=3): >>><<< 30582 1726855289.39515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.39520: handler run complete 30582 1726855289.39544: attempt loop complete, returning result 30582 1726855289.39547: _execute() done 30582 1726855289.39550: dumping result to json 30582 1726855289.39552: done dumping result, returning 30582 1726855289.39692: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000000928] 30582 1726855289.39695: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000928 30582 1726855289.39759: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000928 30582 1726855289.39763: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855289.39836: no more pending results, returning what we have 30582 1726855289.39840: results queue empty 30582 1726855289.39841: checking for any_errors_fatal 30582 1726855289.39843: done checking for any_errors_fatal 30582 1726855289.39844: checking for max_fail_percentage 30582 1726855289.39846: done checking for max_fail_percentage 30582 1726855289.39847: checking to see if all hosts have failed and the running result is not ok 30582 1726855289.39848: done checking to see if all hosts have failed 30582 1726855289.39848: getting the remaining hosts for this loop 30582 1726855289.39850: done getting the remaining hosts for this loop 30582 1726855289.39854: getting the next task for host managed_node3 30582 1726855289.39865: done getting next task for host managed_node3 30582 1726855289.39868: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30582 1726855289.39872: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855289.39880: getting variables 30582 1726855289.39882: in VariableManager get_vars() 30582 1726855289.40036: Calling all_inventory to load vars for managed_node3 30582 1726855289.40039: Calling groups_inventory to load vars for managed_node3 30582 1726855289.40042: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855289.40054: Calling all_plugins_play to load vars for managed_node3 30582 1726855289.40057: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855289.40060: Calling groups_plugins_play to load vars for managed_node3 30582 1726855289.41700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.43383: done with get_vars() 30582 1726855289.43415: done getting variables 30582 1726855289.43490: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855289.43616: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 14:01:29 -0400 (0:00:00.431) 0:00:25.786 ****** 30582 1726855289.43649: entering _queue_task() for managed_node3/assert 30582 1726855289.44220: worker is 1 (out of 1 available) 30582 1726855289.44230: exiting _queue_task() for managed_node3/assert 30582 1726855289.44240: done queuing things up, now waiting for results queue to drain 30582 1726855289.44242: waiting for pending results... 30582 1726855289.44368: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30582 1726855289.44485: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008a9 30582 1726855289.44508: variable 'ansible_search_path' from source: unknown 30582 1726855289.44511: variable 'ansible_search_path' from source: unknown 30582 1726855289.44593: calling self._execute() 30582 1726855289.44648: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.44652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.44664: variable 'omit' from source: magic vars 30582 1726855289.45060: variable 'ansible_distribution_major_version' from source: facts 30582 1726855289.45073: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855289.45083: variable 'omit' from source: magic vars 30582 1726855289.45193: variable 'omit' from source: magic vars 30582 1726855289.45234: variable 'interface' from source: play vars 30582 1726855289.45258: variable 'omit' from source: magic vars 30582 1726855289.45303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855289.45338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855289.45370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30582 1726855289.45392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.45405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.45435: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855289.45439: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.45442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.45592: Set connection var ansible_timeout to 10 30582 1726855289.45596: Set connection var ansible_connection to ssh 30582 1726855289.45598: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855289.45600: Set connection var ansible_pipelining to False 30582 1726855289.45602: Set connection var ansible_shell_executable to /bin/sh 30582 1726855289.45604: Set connection var ansible_shell_type to sh 30582 1726855289.45614: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.45617: variable 'ansible_connection' from source: unknown 30582 1726855289.45619: variable 'ansible_module_compression' from source: unknown 30582 1726855289.45621: variable 'ansible_shell_type' from source: unknown 30582 1726855289.45626: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.45629: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.45793: variable 'ansible_pipelining' from source: unknown 30582 1726855289.45797: variable 'ansible_timeout' from source: unknown 30582 1726855289.45800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.45822: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855289.45825: variable 'omit' from source: magic vars 30582 1726855289.45828: starting attempt loop 30582 1726855289.45830: running the handler 30582 1726855289.45983: variable 'interface_stat' from source: set_fact 30582 1726855289.45999: Evaluated conditional (not interface_stat.stat.exists): True 30582 1726855289.46004: handler run complete 30582 1726855289.46019: attempt loop complete, returning result 30582 1726855289.46026: _execute() done 30582 1726855289.46029: dumping result to json 30582 1726855289.46034: done dumping result, returning 30582 1726855289.46046: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-0000000008a9] 30582 1726855289.46049: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008a9 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855289.46320: no more pending results, returning what we have 30582 1726855289.46325: results queue empty 30582 1726855289.46327: checking for any_errors_fatal 30582 1726855289.46340: done checking for any_errors_fatal 30582 1726855289.46341: checking for max_fail_percentage 30582 1726855289.46343: done checking for max_fail_percentage 30582 1726855289.46344: checking to see if all hosts have failed and the running result is not ok 30582 1726855289.46344: done checking to see if all hosts have failed 30582 1726855289.46345: getting the remaining hosts for this loop 30582 1726855289.46347: done getting the remaining hosts for this loop 30582 1726855289.46351: getting the next task for host managed_node3 30582 1726855289.46362: done getting next task for host managed_node3 30582 1726855289.46365: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855289.46369: ^ state is: HOST STATE: block=4, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855289.46377: getting variables 30582 1726855289.46380: in VariableManager get_vars() 30582 1726855289.46418: Calling all_inventory to load vars for managed_node3 30582 1726855289.46421: Calling groups_inventory to load vars for managed_node3 30582 1726855289.46539: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855289.46551: Calling all_plugins_play to load vars for managed_node3 30582 1726855289.46558: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855289.46564: Calling groups_plugins_play to load vars for managed_node3 30582 1726855289.47084: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008a9 30582 1726855289.47090: WORKER PROCESS EXITING 30582 1726855289.47966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.49632: done with get_vars() 30582 1726855289.49665: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 14:01:29 -0400 (0:00:00.061) 0:00:25.847 ****** 30582 1726855289.49768: entering _queue_task() for managed_node3/include_tasks 30582 1726855289.50149: worker is 1 (out of 1 available) 30582 1726855289.50162: exiting _queue_task() for managed_node3/include_tasks 30582 1726855289.50175: done queuing things up, now waiting for results queue to drain 30582 1726855289.50177: waiting for pending results... 30582 1726855289.50606: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855289.50611: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008ad 30582 1726855289.50624: variable 'ansible_search_path' from source: unknown 30582 1726855289.50632: variable 'ansible_search_path' from source: unknown 30582 1726855289.50675: calling self._execute() 30582 1726855289.50786: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.50800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.50816: variable 'omit' from source: magic vars 30582 1726855289.51190: variable 'ansible_distribution_major_version' from source: facts 30582 1726855289.51210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855289.51222: _execute() done 30582 1726855289.51230: dumping result to json 30582 1726855289.51237: done dumping result, returning 30582 1726855289.51273: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-0000000008ad] 30582 1726855289.51276: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008ad 30582 1726855289.51519: no more pending results, returning what we have 30582 1726855289.51526: in VariableManager get_vars() 30582 1726855289.51569: Calling all_inventory to load vars for managed_node3 30582 1726855289.51572: Calling groups_inventory to load vars for managed_node3 30582 1726855289.51576: Calling all_plugins_inventory to load vars for 
managed_node3 30582 1726855289.51593: Calling all_plugins_play to load vars for managed_node3 30582 1726855289.51597: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855289.51600: Calling groups_plugins_play to load vars for managed_node3 30582 1726855289.52200: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008ad 30582 1726855289.52204: WORKER PROCESS EXITING 30582 1726855289.53344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.56750: done with get_vars() 30582 1726855289.56781: variable 'ansible_search_path' from source: unknown 30582 1726855289.56783: variable 'ansible_search_path' from source: unknown 30582 1726855289.56797: variable 'item' from source: include params 30582 1726855289.57019: variable 'item' from source: include params 30582 1726855289.57054: we have included files to process 30582 1726855289.57056: generating all_blocks data 30582 1726855289.57057: done generating all_blocks data 30582 1726855289.57070: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855289.57072: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855289.57078: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855289.58093: done processing included file 30582 1726855289.58095: iterating over new_blocks loaded from include file 30582 1726855289.58097: in VariableManager get_vars() 30582 1726855289.58114: done with get_vars() 30582 1726855289.58116: filtering new block on tags 30582 1726855289.58240: done filtering new block on tags 30582 1726855289.58244: in VariableManager get_vars() 30582 1726855289.58259: done with get_vars() 30582 1726855289.58262: filtering 
new block on tags 30582 1726855289.58321: done filtering new block on tags 30582 1726855289.58324: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855289.58329: extending task lists for all hosts with included blocks 30582 1726855289.59031: done extending task lists 30582 1726855289.59032: done processing included files 30582 1726855289.59033: results queue empty 30582 1726855289.59034: checking for any_errors_fatal 30582 1726855289.59037: done checking for any_errors_fatal 30582 1726855289.59038: checking for max_fail_percentage 30582 1726855289.59039: done checking for max_fail_percentage 30582 1726855289.59040: checking to see if all hosts have failed and the running result is not ok 30582 1726855289.59041: done checking to see if all hosts have failed 30582 1726855289.59041: getting the remaining hosts for this loop 30582 1726855289.59043: done getting the remaining hosts for this loop 30582 1726855289.59045: getting the next task for host managed_node3 30582 1726855289.59050: done getting next task for host managed_node3 30582 1726855289.59052: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855289.59055: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855289.59057: getting variables 30582 1726855289.59058: in VariableManager get_vars() 30582 1726855289.59069: Calling all_inventory to load vars for managed_node3 30582 1726855289.59071: Calling groups_inventory to load vars for managed_node3 30582 1726855289.59073: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855289.59079: Calling all_plugins_play to load vars for managed_node3 30582 1726855289.59081: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855289.59084: Calling groups_plugins_play to load vars for managed_node3 30582 1726855289.61058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.62733: done with get_vars() 30582 1726855289.62757: done getting variables 30582 1726855289.62808: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:01:29 -0400 
(0:00:00.130) 0:00:25.978 ****** 30582 1726855289.62841: entering _queue_task() for managed_node3/set_fact 30582 1726855289.63210: worker is 1 (out of 1 available) 30582 1726855289.63222: exiting _queue_task() for managed_node3/set_fact 30582 1726855289.63234: done queuing things up, now waiting for results queue to drain 30582 1726855289.63236: waiting for pending results... 30582 1726855289.63539: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855289.63660: in run() - task 0affcc66-ac2b-aa83-7d57-000000000946 30582 1726855289.63682: variable 'ansible_search_path' from source: unknown 30582 1726855289.63692: variable 'ansible_search_path' from source: unknown 30582 1726855289.63735: calling self._execute() 30582 1726855289.63844: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.64302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.64306: variable 'omit' from source: magic vars 30582 1726855289.64906: variable 'ansible_distribution_major_version' from source: facts 30582 1726855289.64933: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855289.65092: variable 'omit' from source: magic vars 30582 1726855289.65100: variable 'omit' from source: magic vars 30582 1726855289.65354: variable 'omit' from source: magic vars 30582 1726855289.65358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855289.65360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855289.65435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855289.65459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.65498: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.65533: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855289.65542: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.65550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.65697: Set connection var ansible_timeout to 10 30582 1726855289.65739: Set connection var ansible_connection to ssh 30582 1726855289.65742: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855289.65745: Set connection var ansible_pipelining to False 30582 1726855289.65747: Set connection var ansible_shell_executable to /bin/sh 30582 1726855289.65748: Set connection var ansible_shell_type to sh 30582 1726855289.65772: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.65799: variable 'ansible_connection' from source: unknown 30582 1726855289.65808: variable 'ansible_module_compression' from source: unknown 30582 1726855289.65814: variable 'ansible_shell_type' from source: unknown 30582 1726855289.65820: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.65826: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.65833: variable 'ansible_pipelining' from source: unknown 30582 1726855289.65839: variable 'ansible_timeout' from source: unknown 30582 1726855289.65920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.66015: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855289.66032: variable 'omit' from source: magic vars 30582 1726855289.66043: starting 
attempt loop 30582 1726855289.66049: running the handler 30582 1726855289.66074: handler run complete 30582 1726855289.66092: attempt loop complete, returning result 30582 1726855289.66100: _execute() done 30582 1726855289.66107: dumping result to json 30582 1726855289.66115: done dumping result, returning 30582 1726855289.66134: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-000000000946] 30582 1726855289.66144: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000946 30582 1726855289.66305: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000946 30582 1726855289.66308: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855289.66408: no more pending results, returning what we have 30582 1726855289.66412: results queue empty 30582 1726855289.66414: checking for any_errors_fatal 30582 1726855289.66416: done checking for any_errors_fatal 30582 1726855289.66417: checking for max_fail_percentage 30582 1726855289.66419: done checking for max_fail_percentage 30582 1726855289.66420: checking to see if all hosts have failed and the running result is not ok 30582 1726855289.66421: done checking to see if all hosts have failed 30582 1726855289.66421: getting the remaining hosts for this loop 30582 1726855289.66423: done getting the remaining hosts for this loop 30582 1726855289.66427: getting the next task for host managed_node3 30582 1726855289.66437: done getting next task for host managed_node3 30582 1726855289.66440: ^ task is: TASK: Stat profile file 30582 1726855289.66446: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855289.66450: getting variables 30582 1726855289.66452: in VariableManager get_vars() 30582 1726855289.66491: Calling all_inventory to load vars for managed_node3 30582 1726855289.66494: Calling groups_inventory to load vars for managed_node3 30582 1726855289.66498: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855289.66510: Calling all_plugins_play to load vars for managed_node3 30582 1726855289.66514: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855289.66517: Calling groups_plugins_play to load vars for managed_node3 30582 1726855289.69259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855289.72282: done with get_vars() 30582 1726855289.72315: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 
2024 14:01:29 -0400 (0:00:00.095) 0:00:26.074 ****** 30582 1726855289.72414: entering _queue_task() for managed_node3/stat 30582 1726855289.72778: worker is 1 (out of 1 available) 30582 1726855289.72993: exiting _queue_task() for managed_node3/stat 30582 1726855289.73005: done queuing things up, now waiting for results queue to drain 30582 1726855289.73007: waiting for pending results... 30582 1726855289.73206: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855289.73240: in run() - task 0affcc66-ac2b-aa83-7d57-000000000947 30582 1726855289.73262: variable 'ansible_search_path' from source: unknown 30582 1726855289.73342: variable 'ansible_search_path' from source: unknown 30582 1726855289.73346: calling self._execute() 30582 1726855289.73408: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.73418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.73431: variable 'omit' from source: magic vars 30582 1726855289.73814: variable 'ansible_distribution_major_version' from source: facts 30582 1726855289.73832: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855289.73842: variable 'omit' from source: magic vars 30582 1726855289.73907: variable 'omit' from source: magic vars 30582 1726855289.74007: variable 'profile' from source: play vars 30582 1726855289.74017: variable 'interface' from source: play vars 30582 1726855289.74084: variable 'interface' from source: play vars 30582 1726855289.74115: variable 'omit' from source: magic vars 30582 1726855289.74161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855289.74211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855289.74319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855289.74323: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.74325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855289.74328: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855289.74330: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.74332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.74424: Set connection var ansible_timeout to 10 30582 1726855289.74434: Set connection var ansible_connection to ssh 30582 1726855289.74445: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855289.74453: Set connection var ansible_pipelining to False 30582 1726855289.74461: Set connection var ansible_shell_executable to /bin/sh 30582 1726855289.74467: Set connection var ansible_shell_type to sh 30582 1726855289.74539: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.74542: variable 'ansible_connection' from source: unknown 30582 1726855289.74544: variable 'ansible_module_compression' from source: unknown 30582 1726855289.74546: variable 'ansible_shell_type' from source: unknown 30582 1726855289.74548: variable 'ansible_shell_executable' from source: unknown 30582 1726855289.74550: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855289.74552: variable 'ansible_pipelining' from source: unknown 30582 1726855289.74554: variable 'ansible_timeout' from source: unknown 30582 1726855289.74556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855289.74747: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855289.74769: variable 'omit' from source: magic vars 30582 1726855289.74779: starting attempt loop 30582 1726855289.74867: running the handler 30582 1726855289.74870: _low_level_execute_command(): starting 30582 1726855289.74873: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855289.75527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.75550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855289.75565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855289.75608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855289.75621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855289.75634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855289.75720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.76102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.76204: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.77884: stdout chunk (state=3): >>>/root <<< 30582 1726855289.78041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.78044: stdout chunk (state=3): >>><<< 30582 1726855289.78046: stderr chunk (state=3): >>><<< 30582 1726855289.78065: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.78086: _low_level_execute_command(): starting 30582 1726855289.78374: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574 `" && echo ansible-tmp-1726855289.7807431-31803-140101694264574="` echo 
/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574 `" ) && sleep 0' 30582 1726855289.79566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.79580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855289.79807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.79906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.80000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.82017: stdout chunk (state=3): >>>ansible-tmp-1726855289.7807431-31803-140101694264574=/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574 <<< 30582 1726855289.82147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.82156: stdout chunk (state=3): >>><<< 30582 1726855289.82168: stderr chunk (state=3): >>><<< 30582 1726855289.82197: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855289.7807431-31803-140101694264574=/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.82394: variable 'ansible_module_compression' from source: unknown 30582 1726855289.82603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855289.82711: variable 'ansible_facts' from source: unknown 30582 1726855289.82750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py 30582 1726855289.83026: Sending initial data 30582 1726855289.83039: Sent initial data (153 bytes) 30582 1726855289.84407: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.84598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.84657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.86382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855289.86433: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855289.86512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl_iq8ekd /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py <<< 30582 1726855289.86515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py" <<< 30582 1726855289.86560: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl_iq8ekd" to remote "/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py" <<< 30582 1726855289.88097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.88108: stdout chunk (state=3): >>><<< 30582 1726855289.88119: stderr chunk (state=3): >>><<< 30582 1726855289.88298: done transferring module to remote 30582 1726855289.88301: _low_level_execute_command(): starting 30582 1726855289.88304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/ /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py && sleep 0' 30582 1726855289.89829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.89939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855289.90191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.90246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855289.92177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855289.92181: stdout chunk (state=3): >>><<< 30582 1726855289.92183: stderr chunk (state=3): >>><<< 30582 1726855289.92204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855289.92215: _low_level_execute_command(): starting 30582 1726855289.92225: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/AnsiballZ_stat.py && sleep 0' 30582 1726855289.93452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855289.93503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855289.93724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855289.93910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.09419: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": 
false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855290.10919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855290.10924: stdout chunk (state=3): >>><<< 30582 1726855290.10926: stderr chunk (state=3): >>><<< 30582 1726855290.10952: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 
closed. 30582 1726855290.10993: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855290.11379: _low_level_execute_command(): starting 30582 1726855290.11383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855289.7807431-31803-140101694264574/ > /dev/null 2>&1 && sleep 0' 30582 1726855290.12196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855290.12491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855290.12500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.12793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.14578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.14629: stderr chunk (state=3): >>><<< 30582 1726855290.14805: stdout chunk (state=3): >>><<< 30582 1726855290.14825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855290.14831: handler run complete 30582 1726855290.14854: attempt loop 
complete, returning result 30582 1726855290.14857: _execute() done 30582 1726855290.14859: dumping result to json 30582 1726855290.14862: done dumping result, returning 30582 1726855290.14872: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-000000000947] 30582 1726855290.14877: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000947 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855290.15156: no more pending results, returning what we have 30582 1726855290.15160: results queue empty 30582 1726855290.15162: checking for any_errors_fatal 30582 1726855290.15177: done checking for any_errors_fatal 30582 1726855290.15178: checking for max_fail_percentage 30582 1726855290.15180: done checking for max_fail_percentage 30582 1726855290.15181: checking to see if all hosts have failed and the running result is not ok 30582 1726855290.15182: done checking to see if all hosts have failed 30582 1726855290.15183: getting the remaining hosts for this loop 30582 1726855290.15185: done getting the remaining hosts for this loop 30582 1726855290.15193: getting the next task for host managed_node3 30582 1726855290.15201: done getting next task for host managed_node3 30582 1726855290.15204: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855290.15207: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855290.15211: getting variables 30582 1726855290.15213: in VariableManager get_vars() 30582 1726855290.15244: Calling all_inventory to load vars for managed_node3 30582 1726855290.15246: Calling groups_inventory to load vars for managed_node3 30582 1726855290.15249: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855290.15260: Calling all_plugins_play to load vars for managed_node3 30582 1726855290.15263: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855290.15265: Calling groups_plugins_play to load vars for managed_node3 30582 1726855290.16097: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000947 30582 1726855290.16101: WORKER PROCESS EXITING 30582 1726855290.18647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855290.22194: done with get_vars() 30582 1726855290.22225: done getting variables 30582 1726855290.22293: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:01:30 -0400 (0:00:00.499) 0:00:26.573 ****** 30582 1726855290.22328: entering _queue_task() for managed_node3/set_fact 30582 1726855290.23306: worker is 1 (out of 1 available) 30582 1726855290.23318: exiting _queue_task() for managed_node3/set_fact 30582 1726855290.23329: done queuing things up, now waiting for results queue to drain 30582 1726855290.23330: waiting for pending results... 30582 1726855290.23751: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855290.24144: in run() - task 0affcc66-ac2b-aa83-7d57-000000000948 30582 1726855290.24165: variable 'ansible_search_path' from source: unknown 30582 1726855290.24292: variable 'ansible_search_path' from source: unknown 30582 1726855290.24296: calling self._execute() 30582 1726855290.24696: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.24699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.24704: variable 'omit' from source: magic vars 30582 1726855290.25299: variable 'ansible_distribution_major_version' from source: facts 30582 1726855290.25363: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855290.25563: variable 'profile_stat' from source: set_fact 30582 1726855290.25666: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855290.25724: when evaluation is False, skipping this task 30582 1726855290.25733: _execute() done 30582 1726855290.25774: dumping result to json 30582 1726855290.25786: done dumping result, returning 30582 1726855290.25806: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-000000000948] 30582 1726855290.25832: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000948 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855290.26296: no more pending results, returning what we have 30582 1726855290.26301: results queue empty 30582 1726855290.26303: checking for any_errors_fatal 30582 1726855290.26314: done checking for any_errors_fatal 30582 1726855290.26315: checking for max_fail_percentage 30582 1726855290.26317: done checking for max_fail_percentage 30582 1726855290.26318: checking to see if all hosts have failed and the running result is not ok 30582 1726855290.26318: done checking to see if all hosts have failed 30582 1726855290.26319: getting the remaining hosts for this loop 30582 1726855290.26321: done getting the remaining hosts for this loop 30582 1726855290.26326: getting the next task for host managed_node3 30582 1726855290.26336: done getting next task for host managed_node3 30582 1726855290.26339: ^ task is: TASK: Get NM profile info 30582 1726855290.26345: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855290.26351: getting variables 30582 1726855290.26353: in VariableManager get_vars() 30582 1726855290.26395: Calling all_inventory to load vars for managed_node3 30582 1726855290.26398: Calling groups_inventory to load vars for managed_node3 30582 1726855290.26402: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855290.26416: Calling all_plugins_play to load vars for managed_node3 30582 1726855290.26420: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855290.26424: Calling groups_plugins_play to load vars for managed_node3 30582 1726855290.27054: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000948 30582 1726855290.27058: WORKER PROCESS EXITING 30582 1726855290.29725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855290.33148: done with get_vars() 30582 1726855290.33183: done getting variables 30582 1726855290.33349: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:01:30 -0400 (0:00:00.111) 0:00:26.685 ****** 30582 1726855290.33526: entering _queue_task() for managed_node3/shell 30582 1726855290.34346: worker is 1 (out of 1 available) 30582 1726855290.34358: exiting _queue_task() for managed_node3/shell 30582 1726855290.34371: 
done queuing things up, now waiting for results queue to drain 30582 1726855290.34376: waiting for pending results... 30582 1726855290.35196: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855290.35290: in run() - task 0affcc66-ac2b-aa83-7d57-000000000949 30582 1726855290.35708: variable 'ansible_search_path' from source: unknown 30582 1726855290.35712: variable 'ansible_search_path' from source: unknown 30582 1726855290.35758: calling self._execute() 30582 1726855290.35847: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.35851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.35876: variable 'omit' from source: magic vars 30582 1726855290.37094: variable 'ansible_distribution_major_version' from source: facts 30582 1726855290.37098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855290.37101: variable 'omit' from source: magic vars 30582 1726855290.37543: variable 'omit' from source: magic vars 30582 1726855290.37648: variable 'profile' from source: play vars 30582 1726855290.37652: variable 'interface' from source: play vars 30582 1726855290.38180: variable 'interface' from source: play vars 30582 1726855290.38184: variable 'omit' from source: magic vars 30582 1726855290.38186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855290.38223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855290.38243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855290.38263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855290.38282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 30582 1726855290.38730: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855290.38734: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.38737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.38940: Set connection var ansible_timeout to 10 30582 1726855290.38943: Set connection var ansible_connection to ssh 30582 1726855290.38945: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855290.38947: Set connection var ansible_pipelining to False 30582 1726855290.38949: Set connection var ansible_shell_executable to /bin/sh 30582 1726855290.38951: Set connection var ansible_shell_type to sh 30582 1726855290.39294: variable 'ansible_shell_executable' from source: unknown 30582 1726855290.39298: variable 'ansible_connection' from source: unknown 30582 1726855290.39301: variable 'ansible_module_compression' from source: unknown 30582 1726855290.39305: variable 'ansible_shell_type' from source: unknown 30582 1726855290.39308: variable 'ansible_shell_executable' from source: unknown 30582 1726855290.39311: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.39315: variable 'ansible_pipelining' from source: unknown 30582 1726855290.39318: variable 'ansible_timeout' from source: unknown 30582 1726855290.39322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.39464: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855290.39481: variable 'omit' from source: magic vars 30582 1726855290.39489: starting attempt loop 30582 1726855290.39492: running the handler 30582 1726855290.39896: Loading 
ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855290.39923: _low_level_execute_command(): starting 30582 1726855290.39926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855290.41143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855290.41152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855290.41232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.41236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855290.41239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.41459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855290.41463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.41560: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30582 1726855290.43408: stdout chunk (state=3): >>>/root <<< 30582 1726855290.43413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.43422: stdout chunk (state=3): >>><<< 30582 1726855290.43424: stderr chunk (state=3): >>><<< 30582 1726855290.43449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855290.43464: _low_level_execute_command(): starting 30582 1726855290.43470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715 `" && echo ansible-tmp-1726855290.434489-31838-55792078148715="` echo /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715 `" ) && sleep 0' 30582 
1726855290.44794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855290.44799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.44801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855290.44804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855290.44806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.44978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855290.44982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855290.44985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.45068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.47006: stdout chunk (state=3): >>>ansible-tmp-1726855290.434489-31838-55792078148715=/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715 <<< 30582 1726855290.47128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.47178: stderr chunk (state=3): >>><<< 30582 1726855290.47182: stdout chunk (state=3): >>><<< 
30582 1726855290.47221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855290.434489-31838-55792078148715=/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855290.47238: variable 'ansible_module_compression' from source: unknown 30582 1726855290.47545: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855290.47548: variable 'ansible_facts' from source: unknown 30582 1726855290.48016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py 30582 1726855290.48511: Sending initial data 30582 1726855290.48514: Sent initial data (154 bytes) 30582 1726855290.49507: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855290.49511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855290.49683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.49691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855290.49694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855290.49697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.49699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855290.49702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855290.50002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.51552: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855290.51713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855290.51775: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpg8gcqmb7 /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py <<< 30582 1726855290.51779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py" <<< 30582 1726855290.51848: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpg8gcqmb7" to remote "/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py" <<< 30582 1726855290.51851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py" <<< 30582 1726855290.53283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.53392: stderr chunk (state=3): >>><<< 30582 1726855290.53395: stdout chunk (state=3): >>><<< 30582 1726855290.53398: done transferring module to remote 30582 1726855290.53400: _low_level_execute_command(): starting 30582 1726855290.53402: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/ /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py && sleep 0' 30582 1726855290.54423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 30582 1726855290.54693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855290.54895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855290.54898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.54901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.56609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.56660: stderr chunk (state=3): >>><<< 30582 1726855290.56806: stdout chunk (state=3): >>><<< 30582 1726855290.56826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855290.56829: _low_level_execute_command(): starting 30582 1726855290.56834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/AnsiballZ_command.py && sleep 0' 30582 1726855290.57899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855290.58190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855290.58206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855290.58218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.58312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.75189: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:30.733411", "end": "2024-09-20 14:01:30.750893", "delta": "0:00:00.017482", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855290.76767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.76829: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855290.76984: stderr chunk (state=3): >>><<< 30582 1726855290.77024: stdout chunk (state=3): >>><<< 30582 1726855290.77106: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:30.733411", "end": "2024-09-20 14:01:30.750893", "delta": "0:00:00.017482", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.244 closed. 30582 1726855290.77110: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855290.77352: _low_level_execute_command(): starting 30582 1726855290.77355: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855290.434489-31838-55792078148715/ > /dev/null 2>&1 && sleep 0' 30582 1726855290.78240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855290.78298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855290.78302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855290.78315: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855290.78321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.78342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855290.78361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855290.78367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855290.78529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855290.78591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855290.80522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855290.80525: stdout chunk (state=3): >>><<< 30582 1726855290.80528: stderr chunk (state=3): >>><<< 30582 1726855290.80693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855290.80697: handler run complete 30582 1726855290.80699: Evaluated conditional (False): False 30582 1726855290.80701: attempt loop complete, returning result 30582 1726855290.80703: _execute() done 30582 1726855290.80705: dumping result to json 30582 1726855290.80707: done dumping result, returning 30582 1726855290.80709: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-000000000949] 30582 1726855290.80711: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000949 30582 1726855290.80785: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000949 ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017482", "end": "2024-09-20 14:01:30.750893", "rc": 0, "start": "2024-09-20 14:01:30.733411" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30582 1726855290.80864: no more pending results, returning what we have 30582 1726855290.80869: results queue empty 30582 1726855290.80870: checking for any_errors_fatal 30582 1726855290.80881: done checking for any_errors_fatal 30582 1726855290.80882: checking for max_fail_percentage 30582 1726855290.80885: done checking for max_fail_percentage 30582 1726855290.80886: checking to see if all hosts have failed and the running result is not ok 30582 1726855290.80886: done checking to see if all hosts have failed 30582 1726855290.80889: getting the remaining hosts for this loop 30582 1726855290.80891: done getting the remaining hosts for this loop 30582 1726855290.80894: getting the next task for host managed_node3 30582 1726855290.80905: done 
getting next task for host managed_node3 30582 1726855290.80908: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855290.80913: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855290.80918: getting variables 30582 1726855290.80920: in VariableManager get_vars() 30582 1726855290.80952: Calling all_inventory to load vars for managed_node3 30582 1726855290.80955: Calling groups_inventory to load vars for managed_node3 30582 1726855290.80958: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855290.80969: Calling all_plugins_play to load vars for managed_node3 30582 1726855290.80975: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855290.80978: Calling groups_plugins_play to load vars for managed_node3 30582 1726855290.81554: WORKER PROCESS EXITING 30582 1726855290.83874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855290.85737: done with get_vars() 30582 1726855290.85765: done getting variables 30582 1726855290.85842: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 14:01:30 -0400 (0:00:00.523) 0:00:27.208 ****** 30582 1726855290.85882: entering _queue_task() for managed_node3/set_fact 30582 1726855290.86305: worker is 1 (out of 1 available) 30582 1726855290.86318: exiting _queue_task() for managed_node3/set_fact 30582 1726855290.86330: done queuing things up, now waiting for results queue to drain 30582 1726855290.86331: waiting for pending results... 
30582 1726855290.86710: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855290.86754: in run() - task 0affcc66-ac2b-aa83-7d57-00000000094a 30582 1726855290.86776: variable 'ansible_search_path' from source: unknown 30582 1726855290.86784: variable 'ansible_search_path' from source: unknown 30582 1726855290.86880: calling self._execute() 30582 1726855290.87396: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.87401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.87405: variable 'omit' from source: magic vars 30582 1726855290.87740: variable 'ansible_distribution_major_version' from source: facts 30582 1726855290.87750: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855290.87893: variable 'nm_profile_exists' from source: set_fact 30582 1726855290.87897: Evaluated conditional (nm_profile_exists.rc == 0): True 30582 1726855290.87902: variable 'omit' from source: magic vars 30582 1726855290.87956: variable 'omit' from source: magic vars 30582 1726855290.87985: variable 'omit' from source: magic vars 30582 1726855290.88031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855290.88059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855290.88079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855290.88098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855290.88110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855290.88139: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
30582 1726855290.88142: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.88145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.88292: Set connection var ansible_timeout to 10 30582 1726855290.88296: Set connection var ansible_connection to ssh 30582 1726855290.88299: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855290.88301: Set connection var ansible_pipelining to False 30582 1726855290.88303: Set connection var ansible_shell_executable to /bin/sh 30582 1726855290.88306: Set connection var ansible_shell_type to sh 30582 1726855290.88308: variable 'ansible_shell_executable' from source: unknown 30582 1726855290.88310: variable 'ansible_connection' from source: unknown 30582 1726855290.88312: variable 'ansible_module_compression' from source: unknown 30582 1726855290.88313: variable 'ansible_shell_type' from source: unknown 30582 1726855290.88315: variable 'ansible_shell_executable' from source: unknown 30582 1726855290.88317: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.88319: variable 'ansible_pipelining' from source: unknown 30582 1726855290.88321: variable 'ansible_timeout' from source: unknown 30582 1726855290.88324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.88584: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855290.88589: variable 'omit' from source: magic vars 30582 1726855290.88591: starting attempt loop 30582 1726855290.88593: running the handler 30582 1726855290.88594: handler run complete 30582 1726855290.88596: attempt loop complete, returning result 30582 1726855290.88598: _execute() done 
30582 1726855290.88599: dumping result to json 30582 1726855290.88601: done dumping result, returning 30582 1726855290.88603: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-00000000094a] 30582 1726855290.88604: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094a 30582 1726855290.88666: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094a 30582 1726855290.88669: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30582 1726855290.88745: no more pending results, returning what we have 30582 1726855290.88748: results queue empty 30582 1726855290.88749: checking for any_errors_fatal 30582 1726855290.88754: done checking for any_errors_fatal 30582 1726855290.88755: checking for max_fail_percentage 30582 1726855290.88756: done checking for max_fail_percentage 30582 1726855290.88757: checking to see if all hosts have failed and the running result is not ok 30582 1726855290.88758: done checking to see if all hosts have failed 30582 1726855290.88758: getting the remaining hosts for this loop 30582 1726855290.88759: done getting the remaining hosts for this loop 30582 1726855290.88763: getting the next task for host managed_node3 30582 1726855290.88775: done getting next task for host managed_node3 30582 1726855290.88777: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855290.88781: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855290.88785: getting variables 30582 1726855290.88786: in VariableManager get_vars() 30582 1726855290.88914: Calling all_inventory to load vars for managed_node3 30582 1726855290.88918: Calling groups_inventory to load vars for managed_node3 30582 1726855290.88921: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855290.88932: Calling all_plugins_play to load vars for managed_node3 30582 1726855290.88935: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855290.88938: Calling groups_plugins_play to load vars for managed_node3 30582 1726855290.91228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855290.93161: done with get_vars() 30582 1726855290.93186: done getting variables 30582 1726855290.93254: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855290.93384: variable 'profile' from source: play vars 30582 
1726855290.93390: variable 'interface' from source: play vars 30582 1726855290.93445: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 14:01:30 -0400 (0:00:00.076) 0:00:27.284 ****** 30582 1726855290.93490: entering _queue_task() for managed_node3/command 30582 1726855290.93854: worker is 1 (out of 1 available) 30582 1726855290.93869: exiting _queue_task() for managed_node3/command 30582 1726855290.93886: done queuing things up, now waiting for results queue to drain 30582 1726855290.94025: waiting for pending results... 30582 1726855290.94262: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30582 1726855290.94694: in run() - task 0affcc66-ac2b-aa83-7d57-00000000094c 30582 1726855290.94700: variable 'ansible_search_path' from source: unknown 30582 1726855290.94703: variable 'ansible_search_path' from source: unknown 30582 1726855290.94706: calling self._execute() 30582 1726855290.94931: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855290.94935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855290.94937: variable 'omit' from source: magic vars 30582 1726855290.95545: variable 'ansible_distribution_major_version' from source: facts 30582 1726855290.95597: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855290.95823: variable 'profile_stat' from source: set_fact 30582 1726855290.95842: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855290.95849: when evaluation is False, skipping this task 30582 1726855290.95854: _execute() done 30582 1726855290.95860: dumping result to json 30582 1726855290.95866: done dumping result, returning 30582 1726855290.95878: done running 
TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000094c] 30582 1726855290.95886: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094c skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855290.96046: no more pending results, returning what we have 30582 1726855290.96050: results queue empty 30582 1726855290.96051: checking for any_errors_fatal 30582 1726855290.96062: done checking for any_errors_fatal 30582 1726855290.96063: checking for max_fail_percentage 30582 1726855290.96066: done checking for max_fail_percentage 30582 1726855290.96067: checking to see if all hosts have failed and the running result is not ok 30582 1726855290.96067: done checking to see if all hosts have failed 30582 1726855290.96068: getting the remaining hosts for this loop 30582 1726855290.96069: done getting the remaining hosts for this loop 30582 1726855290.96075: getting the next task for host managed_node3 30582 1726855290.96084: done getting next task for host managed_node3 30582 1726855290.96086: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855290.96092: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855290.96097: getting variables 30582 1726855290.96099: in VariableManager get_vars() 30582 1726855290.96133: Calling all_inventory to load vars for managed_node3 30582 1726855290.96136: Calling groups_inventory to load vars for managed_node3 30582 1726855290.96140: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855290.96154: Calling all_plugins_play to load vars for managed_node3 30582 1726855290.96158: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855290.96162: Calling groups_plugins_play to load vars for managed_node3 30582 1726855290.96892: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094c 30582 1726855290.96895: WORKER PROCESS EXITING 30582 1726855290.97827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.00337: done with get_vars() 30582 1726855291.00362: done getting variables 30582 1726855291.00412: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.00505: variable 'profile' from source: play vars 30582 1726855291.00508: variable 'interface' from 
source: play vars 30582 1726855291.00577: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 14:01:31 -0400 (0:00:00.071) 0:00:27.356 ****** 30582 1726855291.00619: entering _queue_task() for managed_node3/set_fact 30582 1726855291.01017: worker is 1 (out of 1 available) 30582 1726855291.01029: exiting _queue_task() for managed_node3/set_fact 30582 1726855291.01043: done queuing things up, now waiting for results queue to drain 30582 1726855291.01045: waiting for pending results... 30582 1726855291.01335: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30582 1726855291.01471: in run() - task 0affcc66-ac2b-aa83-7d57-00000000094d 30582 1726855291.01484: variable 'ansible_search_path' from source: unknown 30582 1726855291.01489: variable 'ansible_search_path' from source: unknown 30582 1726855291.01520: calling self._execute() 30582 1726855291.01613: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.01616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.01627: variable 'omit' from source: magic vars 30582 1726855291.01918: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.01927: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.02064: variable 'profile_stat' from source: set_fact 30582 1726855291.02103: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855291.02107: when evaluation is False, skipping this task 30582 1726855291.02109: _execute() done 30582 1726855291.02112: dumping result to json 30582 1726855291.02114: done dumping result, returning 30582 1726855291.02117: done running TaskExecutor() for managed_node3/TASK: 
Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000094d] 30582 1726855291.02119: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094d 30582 1726855291.02227: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094d 30582 1726855291.02230: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855291.02317: no more pending results, returning what we have 30582 1726855291.02321: results queue empty 30582 1726855291.02322: checking for any_errors_fatal 30582 1726855291.02328: done checking for any_errors_fatal 30582 1726855291.02329: checking for max_fail_percentage 30582 1726855291.02331: done checking for max_fail_percentage 30582 1726855291.02332: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.02332: done checking to see if all hosts have failed 30582 1726855291.02333: getting the remaining hosts for this loop 30582 1726855291.02335: done getting the remaining hosts for this loop 30582 1726855291.02338: getting the next task for host managed_node3 30582 1726855291.02346: done getting next task for host managed_node3 30582 1726855291.02349: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30582 1726855291.02358: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855291.02365: getting variables 30582 1726855291.02366: in VariableManager get_vars() 30582 1726855291.02403: Calling all_inventory to load vars for managed_node3 30582 1726855291.02406: Calling groups_inventory to load vars for managed_node3 30582 1726855291.02409: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.02418: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.02420: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.02423: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.03782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.09579: done with get_vars() 30582 1726855291.09616: done getting variables 30582 1726855291.09676: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.09785: variable 'profile' from source: play vars 30582 1726855291.09790: variable 'interface' from source: play vars 30582 1726855291.09858: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] 
**************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 14:01:31 -0400 (0:00:00.092) 0:00:27.448 ****** 30582 1726855291.09894: entering _queue_task() for managed_node3/command 30582 1726855291.10528: worker is 1 (out of 1 available) 30582 1726855291.10541: exiting _queue_task() for managed_node3/command 30582 1726855291.10550: done queuing things up, now waiting for results queue to drain 30582 1726855291.10552: waiting for pending results... 30582 1726855291.10809: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30582 1726855291.10917: in run() - task 0affcc66-ac2b-aa83-7d57-00000000094e 30582 1726855291.10922: variable 'ansible_search_path' from source: unknown 30582 1726855291.10928: variable 'ansible_search_path' from source: unknown 30582 1726855291.10932: calling self._execute() 30582 1726855291.10982: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.10985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.10999: variable 'omit' from source: magic vars 30582 1726855291.11401: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.11414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.11537: variable 'profile_stat' from source: set_fact 30582 1726855291.11554: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855291.11668: when evaluation is False, skipping this task 30582 1726855291.11674: _execute() done 30582 1726855291.11677: dumping result to json 30582 1726855291.11680: done dumping result, returning 30582 1726855291.11697: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000094e] 30582 1726855291.11699: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000094e 30582 1726855291.11758: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094e 30582 1726855291.11761: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855291.11845: no more pending results, returning what we have 30582 1726855291.11848: results queue empty 30582 1726855291.11849: checking for any_errors_fatal 30582 1726855291.11858: done checking for any_errors_fatal 30582 1726855291.11859: checking for max_fail_percentage 30582 1726855291.11861: done checking for max_fail_percentage 30582 1726855291.11863: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.11863: done checking to see if all hosts have failed 30582 1726855291.11864: getting the remaining hosts for this loop 30582 1726855291.11866: done getting the remaining hosts for this loop 30582 1726855291.11870: getting the next task for host managed_node3 30582 1726855291.11880: done getting next task for host managed_node3 30582 1726855291.11883: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855291.11889: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855291.11894: getting variables 30582 1726855291.11896: in VariableManager get_vars() 30582 1726855291.11929: Calling all_inventory to load vars for managed_node3 30582 1726855291.11932: Calling groups_inventory to load vars for managed_node3 30582 1726855291.11935: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.11947: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.11950: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.11953: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.13646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.15504: done with get_vars() 30582 1726855291.15537: done getting variables 30582 1726855291.15619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.15766: variable 'profile' from source: play vars 30582 1726855291.15771: variable 'interface' from source: play vars 30582 1726855291.15859: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:01:31 -0400 (0:00:00.059) 
0:00:27.508 ****** 30582 1726855291.15897: entering _queue_task() for managed_node3/set_fact 30582 1726855291.16521: worker is 1 (out of 1 available) 30582 1726855291.16531: exiting _queue_task() for managed_node3/set_fact 30582 1726855291.16554: done queuing things up, now waiting for results queue to drain 30582 1726855291.16556: waiting for pending results... 30582 1726855291.16900: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855291.16907: in run() - task 0affcc66-ac2b-aa83-7d57-00000000094f 30582 1726855291.17090: variable 'ansible_search_path' from source: unknown 30582 1726855291.17095: variable 'ansible_search_path' from source: unknown 30582 1726855291.17234: calling self._execute() 30582 1726855291.17261: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.17270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.17276: variable 'omit' from source: magic vars 30582 1726855291.17639: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.17648: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.17956: variable 'profile_stat' from source: set_fact 30582 1726855291.17961: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855291.17968: when evaluation is False, skipping this task 30582 1726855291.17979: _execute() done 30582 1726855291.17992: dumping result to json 30582 1726855291.17999: done dumping result, returning 30582 1726855291.18009: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000094f] 30582 1726855291.18017: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094f 30582 1726855291.18694: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000094f skipping: [managed_node3] => { "changed": false, "false_condition": 
"profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855291.18735: no more pending results, returning what we have 30582 1726855291.18739: results queue empty 30582 1726855291.18740: checking for any_errors_fatal 30582 1726855291.18744: done checking for any_errors_fatal 30582 1726855291.18745: checking for max_fail_percentage 30582 1726855291.18749: done checking for max_fail_percentage 30582 1726855291.18750: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.18751: done checking to see if all hosts have failed 30582 1726855291.18752: getting the remaining hosts for this loop 30582 1726855291.18753: done getting the remaining hosts for this loop 30582 1726855291.18760: getting the next task for host managed_node3 30582 1726855291.18770: done getting next task for host managed_node3 30582 1726855291.18775: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30582 1726855291.18778: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855291.18783: getting variables 30582 1726855291.18784: in VariableManager get_vars() 30582 1726855291.18818: Calling all_inventory to load vars for managed_node3 30582 1726855291.18821: Calling groups_inventory to load vars for managed_node3 30582 1726855291.18825: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.18835: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.18838: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.18840: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.19401: WORKER PROCESS EXITING 30582 1726855291.20453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.23386: done with get_vars() 30582 1726855291.23420: done getting variables 30582 1726855291.23695: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.23825: variable 'profile' from source: play vars 30582 1726855291.23829: variable 'interface' from source: play vars 30582 1726855291.24099: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 14:01:31 -0400 (0:00:00.082) 0:00:27.591 ****** 30582 1726855291.24132: entering _queue_task() for managed_node3/assert 30582 1726855291.24933: worker is 1 (out of 1 available) 30582 1726855291.24945: exiting _queue_task() for managed_node3/assert 30582 1726855291.24956: done queuing things up, now waiting for results queue to drain 30582 1726855291.24958: 
waiting for pending results... 30582 1726855291.25456: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30582 1726855291.25796: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008ae 30582 1726855291.25803: variable 'ansible_search_path' from source: unknown 30582 1726855291.25893: variable 'ansible_search_path' from source: unknown 30582 1726855291.25932: calling self._execute() 30582 1726855291.26184: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.26393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.26397: variable 'omit' from source: magic vars 30582 1726855291.27637: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.27697: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.27811: variable 'omit' from source: magic vars 30582 1726855291.28105: variable 'omit' from source: magic vars 30582 1726855291.28565: variable 'profile' from source: play vars 30582 1726855291.28675: variable 'interface' from source: play vars 30582 1726855291.28797: variable 'interface' from source: play vars 30582 1726855291.28900: variable 'omit' from source: magic vars 30582 1726855291.29167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855291.29172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855291.29281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855291.29395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.29399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.29490: variable 'inventory_hostname' 
from source: host vars for 'managed_node3' 30582 1726855291.29569: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.29615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.30018: Set connection var ansible_timeout to 10 30582 1726855291.30047: Set connection var ansible_connection to ssh 30582 1726855291.30267: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855291.30271: Set connection var ansible_pipelining to False 30582 1726855291.30276: Set connection var ansible_shell_executable to /bin/sh 30582 1726855291.30278: Set connection var ansible_shell_type to sh 30582 1726855291.30282: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.30284: variable 'ansible_connection' from source: unknown 30582 1726855291.30286: variable 'ansible_module_compression' from source: unknown 30582 1726855291.30289: variable 'ansible_shell_type' from source: unknown 30582 1726855291.30291: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.30293: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.30295: variable 'ansible_pipelining' from source: unknown 30582 1726855291.30297: variable 'ansible_timeout' from source: unknown 30582 1726855291.30300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.30591: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.30609: variable 'omit' from source: magic vars 30582 1726855291.30794: starting attempt loop 30582 1726855291.30800: running the handler 30582 1726855291.30961: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855291.30977: 
Evaluated conditional (lsr_net_profile_exists): True 30582 1726855291.30993: handler run complete 30582 1726855291.31016: attempt loop complete, returning result 30582 1726855291.31134: _execute() done 30582 1726855291.31137: dumping result to json 30582 1726855291.31140: done dumping result, returning 30582 1726855291.31142: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcc66-ac2b-aa83-7d57-0000000008ae] 30582 1726855291.31144: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008ae 30582 1726855291.31351: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008ae 30582 1726855291.31357: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855291.31411: no more pending results, returning what we have 30582 1726855291.31415: results queue empty 30582 1726855291.31416: checking for any_errors_fatal 30582 1726855291.31423: done checking for any_errors_fatal 30582 1726855291.31424: checking for max_fail_percentage 30582 1726855291.31426: done checking for max_fail_percentage 30582 1726855291.31427: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.31428: done checking to see if all hosts have failed 30582 1726855291.31429: getting the remaining hosts for this loop 30582 1726855291.31430: done getting the remaining hosts for this loop 30582 1726855291.31434: getting the next task for host managed_node3 30582 1726855291.31443: done getting next task for host managed_node3 30582 1726855291.31446: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30582 1726855291.31450: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855291.31455: getting variables 30582 1726855291.31457: in VariableManager get_vars() 30582 1726855291.31493: Calling all_inventory to load vars for managed_node3 30582 1726855291.31496: Calling groups_inventory to load vars for managed_node3 30582 1726855291.31499: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.31510: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.31513: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.31515: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.34754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.38038: done with get_vars() 30582 1726855291.38069: done getting variables 30582 1726855291.38132: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.38254: variable 'profile' from source: play vars 30582 1726855291.38258: variable 'interface' from source: play vars 30582 1726855291.38456: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 14:01:31 -0400 (0:00:00.144) 0:00:27.735 ****** 30582 1726855291.38536: entering _queue_task() for managed_node3/assert 30582 1726855291.38935: worker is 1 (out of 1 available) 30582 1726855291.38949: exiting _queue_task() for managed_node3/assert 30582 1726855291.38962: done queuing things up, now waiting for results queue to drain 30582 1726855291.38964: waiting for pending results... 30582 1726855291.39283: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30582 1726855291.39483: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008af 30582 1726855291.39490: variable 'ansible_search_path' from source: unknown 30582 1726855291.39493: variable 'ansible_search_path' from source: unknown 30582 1726855291.39504: calling self._execute() 30582 1726855291.39615: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.39627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.39641: variable 'omit' from source: magic vars 30582 1726855291.40056: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.40133: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.40137: variable 'omit' from source: magic vars 30582 1726855291.40144: variable 'omit' from source: magic vars 30582 1726855291.40256: variable 'profile' from source: play vars 30582 1726855291.40267: variable 'interface' from source: play vars 30582 1726855291.40336: variable 'interface' from source: play vars 30582 1726855291.40369: variable 'omit' from source: magic vars 30582 1726855291.40439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
30582 1726855291.40493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855291.40536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855291.40681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.40684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.40694: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855291.40698: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.40701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.41017: Set connection var ansible_timeout to 10 30582 1726855291.41095: Set connection var ansible_connection to ssh 30582 1726855291.41133: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855291.41154: Set connection var ansible_pipelining to False 30582 1726855291.41219: Set connection var ansible_shell_executable to /bin/sh 30582 1726855291.41227: Set connection var ansible_shell_type to sh 30582 1726855291.41244: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.41257: variable 'ansible_connection' from source: unknown 30582 1726855291.41265: variable 'ansible_module_compression' from source: unknown 30582 1726855291.41272: variable 'ansible_shell_type' from source: unknown 30582 1726855291.41336: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.41340: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.41343: variable 'ansible_pipelining' from source: unknown 30582 1726855291.41363: variable 'ansible_timeout' from source: unknown 30582 1726855291.41366: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.41530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.41582: variable 'omit' from source: magic vars 30582 1726855291.41588: starting attempt loop 30582 1726855291.41592: running the handler 30582 1726855291.41714: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30582 1726855291.41725: Evaluated conditional (lsr_net_profile_ansible_managed): True 30582 1726855291.41763: handler run complete 30582 1726855291.41770: attempt loop complete, returning result 30582 1726855291.41772: _execute() done 30582 1726855291.41777: dumping result to json 30582 1726855291.41779: done dumping result, returning 30582 1726855291.41785: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcc66-ac2b-aa83-7d57-0000000008af] 30582 1726855291.41802: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008af 30582 1726855291.41947: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008af 30582 1726855291.41950: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855291.42034: no more pending results, returning what we have 30582 1726855291.42038: results queue empty 30582 1726855291.42039: checking for any_errors_fatal 30582 1726855291.42047: done checking for any_errors_fatal 30582 1726855291.42048: checking for max_fail_percentage 30582 1726855291.42050: done checking for max_fail_percentage 30582 1726855291.42051: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.42052: done checking to see if all hosts have failed 30582 1726855291.42053: 
getting the remaining hosts for this loop 30582 1726855291.42054: done getting the remaining hosts for this loop 30582 1726855291.42058: getting the next task for host managed_node3 30582 1726855291.42067: done getting next task for host managed_node3 30582 1726855291.42071: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30582 1726855291.42077: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855291.42082: getting variables 30582 1726855291.42084: in VariableManager get_vars() 30582 1726855291.42120: Calling all_inventory to load vars for managed_node3 30582 1726855291.42123: Calling groups_inventory to load vars for managed_node3 30582 1726855291.42127: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.42139: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.42142: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.42145: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.44136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.45854: done with get_vars() 30582 1726855291.45888: done getting variables 30582 1726855291.45958: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.46092: variable 'profile' from source: play vars 30582 1726855291.46096: variable 'interface' from source: play vars 30582 1726855291.46169: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 14:01:31 -0400 (0:00:00.076) 0:00:27.812 ****** 30582 1726855291.46210: entering _queue_task() for managed_node3/assert 30582 1726855291.46641: worker is 1 (out of 1 available) 30582 1726855291.46653: exiting _queue_task() for managed_node3/assert 30582 1726855291.46780: done queuing things up, now waiting for results queue to drain 30582 1726855291.46782: waiting for pending results... 
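The three `assert` tasks traced in this run come from `assert_profile_present.yml` (the log cites lines 5, 10, and 15 of that file). A hypothetical reconstruction of those tasks, inferred only from the task names and the conditionals the log reports evaluating (the actual file in fedora.linux_system_roles may differ):

```yaml
# Hedged sketch of assert_profile_present.yml, inferred from this log;
# task names and conditionals match the trace, everything else is assumed.
- name: "Assert that the profile is present - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists           # log: Evaluated conditional (lsr_net_profile_exists): True

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed  # log: Evaluated conditional (...): True

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint      # log: Evaluated conditional (...): True
```

Each assertion passes, which is why the log shows `ok: [managed_node3]` with `MSG: All assertions passed` rather than a skip or failure.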
30582 1726855291.47111: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30582 1726855291.47118: in run() - task 0affcc66-ac2b-aa83-7d57-0000000008b0 30582 1726855291.47126: variable 'ansible_search_path' from source: unknown 30582 1726855291.47133: variable 'ansible_search_path' from source: unknown 30582 1726855291.47176: calling self._execute() 30582 1726855291.47291: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.47302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.47328: variable 'omit' from source: magic vars 30582 1726855291.48236: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.48240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.48243: variable 'omit' from source: magic vars 30582 1726855291.48280: variable 'omit' from source: magic vars 30582 1726855291.48584: variable 'profile' from source: play vars 30582 1726855291.48599: variable 'interface' from source: play vars 30582 1726855291.48737: variable 'interface' from source: play vars 30582 1726855291.48776: variable 'omit' from source: magic vars 30582 1726855291.48894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855291.48935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855291.48966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855291.49002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.49063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.49068: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 30582 1726855291.49094: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.49102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.49232: Set connection var ansible_timeout to 10 30582 1726855291.49239: Set connection var ansible_connection to ssh 30582 1726855291.49277: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855291.49280: Set connection var ansible_pipelining to False 30582 1726855291.49282: Set connection var ansible_shell_executable to /bin/sh 30582 1726855291.49285: Set connection var ansible_shell_type to sh 30582 1726855291.49310: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.49327: variable 'ansible_connection' from source: unknown 30582 1726855291.49330: variable 'ansible_module_compression' from source: unknown 30582 1726855291.49332: variable 'ansible_shell_type' from source: unknown 30582 1726855291.49385: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.49389: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.49392: variable 'ansible_pipelining' from source: unknown 30582 1726855291.49394: variable 'ansible_timeout' from source: unknown 30582 1726855291.49396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.49526: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.49548: variable 'omit' from source: magic vars 30582 1726855291.49558: starting attempt loop 30582 1726855291.49564: running the handler 30582 1726855291.49699: variable 'lsr_net_profile_fingerprint' from source: set_fact 30582 1726855291.49763: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30582 1726855291.49766: handler run complete 30582 1726855291.49768: attempt loop complete, returning result 30582 1726855291.49770: _execute() done 30582 1726855291.49775: dumping result to json 30582 1726855291.49778: done dumping result, returning 30582 1726855291.49780: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcc66-ac2b-aa83-7d57-0000000008b0] 30582 1726855291.49782: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008b0 30582 1726855291.50034: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000008b0 30582 1726855291.50038: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855291.50099: no more pending results, returning what we have 30582 1726855291.50103: results queue empty 30582 1726855291.50104: checking for any_errors_fatal 30582 1726855291.50112: done checking for any_errors_fatal 30582 1726855291.50113: checking for max_fail_percentage 30582 1726855291.50115: done checking for max_fail_percentage 30582 1726855291.50116: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.50117: done checking to see if all hosts have failed 30582 1726855291.50118: getting the remaining hosts for this loop 30582 1726855291.50119: done getting the remaining hosts for this loop 30582 1726855291.50123: getting the next task for host managed_node3 30582 1726855291.50134: done getting next task for host managed_node3 30582 1726855291.50138: ^ task is: TASK: Conditional asserts 30582 1726855291.50142: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855291.50146: getting variables 30582 1726855291.50148: in VariableManager get_vars() 30582 1726855291.50189: Calling all_inventory to load vars for managed_node3 30582 1726855291.50192: Calling groups_inventory to load vars for managed_node3 30582 1726855291.50196: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.50209: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.50212: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.50215: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.52848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.55236: done with get_vars() 30582 1726855291.55266: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 14:01:31 -0400 (0:00:00.091) 0:00:27.903 ****** 30582 1726855291.55371: entering _queue_task() for managed_node3/include_tasks 30582 1726855291.55845: worker is 1 (out of 1 available) 30582 1726855291.55858: exiting _queue_task() for managed_node3/include_tasks 30582 1726855291.55872: done queuing things up, now waiting for results queue to drain 30582 1726855291.55876: waiting for pending results... 
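Two distinct skip patterns appear in this trace: tasks gated by `when: profile_stat.stat.exists` report `"skip_reason": "Conditional result was False"`, while the `Conditional asserts` include reports `"skipped_reason": "No items in the list"`, which is what an `include_tasks` driven by an empty loop produces. A minimal illustrative reproduction of both patterns (task names and the loop variable are hypothetical, not taken from the actual test files):

```yaml
# Hedged sketch of the two skip mechanisms seen in this log.
- name: Skipped by a false conditional          # -> "Conditional result was False"
  ansible.builtin.debug:
    msg: "never runs"
  when: profile_stat.stat.exists                # evaluated False in this run

- name: Skipped by an empty loop                # -> "No items in the list"
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ extra_assert_files | default([]) }}"  # hypothetical var; empty here
```

This distinction is useful when scanning verbose output: a false `when` means the task was reached and its condition evaluated, whereas an empty loop means the task body was never instantiated at all.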
30582 1726855291.56244: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855291.56394: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005ba 30582 1726855291.56398: variable 'ansible_search_path' from source: unknown 30582 1726855291.56401: variable 'ansible_search_path' from source: unknown 30582 1726855291.56779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855291.59278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855291.59339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855291.59489: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855291.59493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855291.59495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855291.59569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855291.59622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855291.59655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855291.59706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30582 1726855291.59742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855291.59943: dumping result to json 30582 1726855291.59946: done dumping result, returning 30582 1726855291.59949: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-0000000005ba] 30582 1726855291.59951: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005ba skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30582 1726855291.60124: no more pending results, returning what we have 30582 1726855291.60128: results queue empty 30582 1726855291.60130: checking for any_errors_fatal 30582 1726855291.60135: done checking for any_errors_fatal 30582 1726855291.60136: checking for max_fail_percentage 30582 1726855291.60138: done checking for max_fail_percentage 30582 1726855291.60139: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.60140: done checking to see if all hosts have failed 30582 1726855291.60141: getting the remaining hosts for this loop 30582 1726855291.60143: done getting the remaining hosts for this loop 30582 1726855291.60147: getting the next task for host managed_node3 30582 1726855291.60155: done getting next task for host managed_node3 30582 1726855291.60158: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855291.60161: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855291.60164: getting variables 30582 1726855291.60166: in VariableManager get_vars() 30582 1726855291.60423: Calling all_inventory to load vars for managed_node3 30582 1726855291.60426: Calling groups_inventory to load vars for managed_node3 30582 1726855291.60430: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.60441: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.60445: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.60449: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.61033: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005ba 30582 1726855291.61037: WORKER PROCESS EXITING 30582 1726855291.62090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.63803: done with get_vars() 30582 1726855291.63830: done getting variables 30582 1726855291.63907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855291.64035: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 14:01:31 -0400 (0:00:00.086) 0:00:27.990 ****** 30582 1726855291.64066: entering _queue_task() for managed_node3/debug 30582 1726855291.64555: worker is 1 (out of 1 available) 30582 1726855291.64568: exiting _queue_task() for managed_node3/debug 30582 
1726855291.64582: done queuing things up, now waiting for results queue to drain 30582 1726855291.64584: waiting for pending results... 30582 1726855291.64797: running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile without autoconnect' 30582 1726855291.64930: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005bb 30582 1726855291.64964: variable 'ansible_search_path' from source: unknown 30582 1726855291.64968: variable 'ansible_search_path' from source: unknown 30582 1726855291.65033: calling self._execute() 30582 1726855291.65126: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.65183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.65188: variable 'omit' from source: magic vars 30582 1726855291.65584: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.65605: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.65623: variable 'omit' from source: magic vars 30582 1726855291.65664: variable 'omit' from source: magic vars 30582 1726855291.65797: variable 'lsr_description' from source: include params 30582 1726855291.65816: variable 'omit' from source: magic vars 30582 1726855291.65905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855291.65916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855291.65948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855291.65971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.65995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.66038: variable 'inventory_hostname' from 
source: host vars for 'managed_node3' 30582 1726855291.66093: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.66097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.66198: Set connection var ansible_timeout to 10 30582 1726855291.66206: Set connection var ansible_connection to ssh 30582 1726855291.66219: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855291.66237: Set connection var ansible_pipelining to False 30582 1726855291.66249: Set connection var ansible_shell_executable to /bin/sh 30582 1726855291.66255: Set connection var ansible_shell_type to sh 30582 1726855291.66341: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.66345: variable 'ansible_connection' from source: unknown 30582 1726855291.66347: variable 'ansible_module_compression' from source: unknown 30582 1726855291.66349: variable 'ansible_shell_type' from source: unknown 30582 1726855291.66351: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.66353: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.66355: variable 'ansible_pipelining' from source: unknown 30582 1726855291.66357: variable 'ansible_timeout' from source: unknown 30582 1726855291.66359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.66502: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.66521: variable 'omit' from source: magic vars 30582 1726855291.66533: starting attempt loop 30582 1726855291.66541: running the handler 30582 1726855291.66606: handler run complete 30582 1726855291.66669: attempt loop complete, returning result 30582 
1726855291.66672: _execute() done 30582 1726855291.66678: dumping result to json 30582 1726855291.66680: done dumping result, returning 30582 1726855291.66682: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can create a profile without autoconnect' [0affcc66-ac2b-aa83-7d57-0000000005bb] 30582 1726855291.66684: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005bb ok: [managed_node3] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 30582 1726855291.66831: no more pending results, returning what we have 30582 1726855291.66835: results queue empty 30582 1726855291.66836: checking for any_errors_fatal 30582 1726855291.66845: done checking for any_errors_fatal 30582 1726855291.66846: checking for max_fail_percentage 30582 1726855291.66848: done checking for max_fail_percentage 30582 1726855291.66849: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.66850: done checking to see if all hosts have failed 30582 1726855291.66851: getting the remaining hosts for this loop 30582 1726855291.66852: done getting the remaining hosts for this loop 30582 1726855291.66856: getting the next task for host managed_node3 30582 1726855291.66865: done getting next task for host managed_node3 30582 1726855291.66868: ^ task is: TASK: Cleanup 30582 1726855291.66871: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855291.66883: getting variables 30582 1726855291.66885: in VariableManager get_vars() 30582 1726855291.66920: Calling all_inventory to load vars for managed_node3 30582 1726855291.66923: Calling groups_inventory to load vars for managed_node3 30582 1726855291.66927: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.66939: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.66943: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.66946: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.67625: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005bb 30582 1726855291.67629: WORKER PROCESS EXITING 30582 1726855291.70877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.75195: done with get_vars() 30582 1726855291.75226: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:01:31 -0400 (0:00:00.112) 0:00:28.103 ****** 30582 1726855291.75334: entering _queue_task() for managed_node3/include_tasks 30582 1726855291.76247: worker is 1 (out of 1 available) 30582 1726855291.76263: exiting _queue_task() for managed_node3/include_tasks 30582 1726855291.76275: done queuing things up, now waiting for results queue to drain 30582 1726855291.76277: waiting for pending results... 
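Per-host results in this log take the form `status: [host] => { json payload }` (e.g. the `skipping: [managed_node3]` and `ok: [managed_node3]` lines above). A sketch of splitting such a line into its parts, with the payload compacted to one line here for illustration:

```python
import json

# Result line shape from the log: "<status>: [<host>] => <json>".
line = ('skipping: [managed_node3] => '
        '{"changed": false, "skipped_reason": "No items in the list"}')

status, rest = line.split(": ", 1)
host = rest[rest.index("[") + 1 : rest.index("]")]
payload = json.loads(rest[rest.index("{"):])  # the structured task result
print(status, host, payload["skipped_reason"])
```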
30582 1726855291.76686: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855291.76777: in run() - task 0affcc66-ac2b-aa83-7d57-0000000005bf 30582 1726855291.76796: variable 'ansible_search_path' from source: unknown 30582 1726855291.76800: variable 'ansible_search_path' from source: unknown 30582 1726855291.76940: variable 'lsr_cleanup' from source: include params 30582 1726855291.77468: variable 'lsr_cleanup' from source: include params 30582 1726855291.77535: variable 'omit' from source: magic vars 30582 1726855291.77667: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.77676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.77683: variable 'omit' from source: magic vars 30582 1726855291.77920: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.77930: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.77935: variable 'item' from source: unknown 30582 1726855291.77997: variable 'item' from source: unknown 30582 1726855291.78027: variable 'item' from source: unknown 30582 1726855291.78084: variable 'item' from source: unknown 30582 1726855291.78305: dumping result to json 30582 1726855291.78308: done dumping result, returning 30582 1726855291.78316: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-0000000005bf] 30582 1726855291.78319: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005bf 30582 1726855291.78355: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000005bf 30582 1726855291.78357: WORKER PROCESS EXITING 30582 1726855291.78434: no more pending results, returning what we have 30582 1726855291.78439: in VariableManager get_vars() 30582 1726855291.78469: Calling all_inventory to load vars for managed_node3 30582 1726855291.78471: Calling groups_inventory to load vars for managed_node3 30582 1726855291.78477: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855291.78486: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.78491: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.78494: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.79896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.83450: done with get_vars() 30582 1726855291.83477: variable 'ansible_search_path' from source: unknown 30582 1726855291.83479: variable 'ansible_search_path' from source: unknown 30582 1726855291.83732: we have included files to process 30582 1726855291.83733: generating all_blocks data 30582 1726855291.83736: done generating all_blocks data 30582 1726855291.83741: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855291.83742: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855291.83745: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855291.84315: done processing included file 30582 1726855291.84318: iterating over new_blocks loaded from include file 30582 1726855291.84320: in VariableManager get_vars() 30582 1726855291.84336: done with get_vars() 30582 1726855291.84338: filtering new block on tags 30582 1726855291.84546: done filtering new block on tags 30582 1726855291.84549: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855291.84554: extending task lists for all hosts with included blocks 
30582 1726855291.87697: done extending task lists 30582 1726855291.87699: done processing included files 30582 1726855291.87700: results queue empty 30582 1726855291.87701: checking for any_errors_fatal 30582 1726855291.87705: done checking for any_errors_fatal 30582 1726855291.87706: checking for max_fail_percentage 30582 1726855291.87707: done checking for max_fail_percentage 30582 1726855291.87708: checking to see if all hosts have failed and the running result is not ok 30582 1726855291.87709: done checking to see if all hosts have failed 30582 1726855291.87709: getting the remaining hosts for this loop 30582 1726855291.87711: done getting the remaining hosts for this loop 30582 1726855291.87713: getting the next task for host managed_node3 30582 1726855291.87718: done getting next task for host managed_node3 30582 1726855291.87720: ^ task is: TASK: Cleanup profile and device 30582 1726855291.87723: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855291.87726: getting variables 30582 1726855291.87727: in VariableManager get_vars() 30582 1726855291.87741: Calling all_inventory to load vars for managed_node3 30582 1726855291.87743: Calling groups_inventory to load vars for managed_node3 30582 1726855291.87746: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855291.87752: Calling all_plugins_play to load vars for managed_node3 30582 1726855291.87754: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855291.87757: Calling groups_plugins_play to load vars for managed_node3 30582 1726855291.90709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855291.93906: done with get_vars() 30582 1726855291.93945: done getting variables 30582 1726855291.93997: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:01:31 -0400 (0:00:00.186) 0:00:28.290 ****** 30582 1726855291.94039: entering _queue_task() for managed_node3/shell 30582 1726855291.94441: worker is 1 (out of 1 available) 30582 1726855291.94456: exiting _queue_task() for managed_node3/shell 30582 1726855291.94613: done queuing things up, now waiting for results queue to drain 30582 1726855291.94615: waiting for pending results... 
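The `_low_level_execute_command()` exchanges that follow run, in order: a home-directory probe (`echo ~ && sleep 0`), creation of a private per-task temp directory under `umask 77`, an sftp transfer of `AnsiballZ_command.py`, a `chmod u+x`, and execution. A local sketch of the first two steps using the same shell fragments seen in the log (the base path and stamp below are illustrative; Ansible generates its own under `~/.ansible/tmp`):

```python
import os
import subprocess
import tempfile
import time

# Step 1: probe the home directory exactly as the log shows.
home = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                      capture_output=True, text=True).stdout.strip()

# Step 2: create a private temp dir; umask 77 yields mode 700.
base = os.path.join(tempfile.gettempdir(), "ansible-sketch")  # illustrative base
stamp = f"ansible-tmp-{time.time()}-{os.getpid()}"
cmd = (f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{stamp}" '
       f'&& echo "{base}/{stamp}" ) && sleep 0')
tmpdir = subprocess.run(["/bin/sh", "-c", cmd],
                        capture_output=True, text=True).stdout.strip()
print(home, tmpdir)
```

The `sleep 0` suffix mirrors the log's commands; it is a cheap no-op that normalizes the compound command's exit status.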
30582 1726855291.94913: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855291.94960: in run() - task 0affcc66-ac2b-aa83-7d57-0000000009a0 30582 1726855291.94982: variable 'ansible_search_path' from source: unknown 30582 1726855291.94993: variable 'ansible_search_path' from source: unknown 30582 1726855291.95063: calling self._execute() 30582 1726855291.95168: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.95233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.95237: variable 'omit' from source: magic vars 30582 1726855291.95705: variable 'ansible_distribution_major_version' from source: facts 30582 1726855291.95721: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855291.95733: variable 'omit' from source: magic vars 30582 1726855291.95944: variable 'omit' from source: magic vars 30582 1726855291.96331: variable 'interface' from source: play vars 30582 1726855291.96334: variable 'omit' from source: magic vars 30582 1726855291.96336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855291.96495: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855291.96501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855291.96526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.96615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855291.96650: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855291.96662: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.96670: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.96984: Set connection var ansible_timeout to 10 30582 1726855291.96989: Set connection var ansible_connection to ssh 30582 1726855291.96991: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855291.96995: Set connection var ansible_pipelining to False 30582 1726855291.97006: Set connection var ansible_shell_executable to /bin/sh 30582 1726855291.97013: Set connection var ansible_shell_type to sh 30582 1726855291.97146: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.97150: variable 'ansible_connection' from source: unknown 30582 1726855291.97152: variable 'ansible_module_compression' from source: unknown 30582 1726855291.97154: variable 'ansible_shell_type' from source: unknown 30582 1726855291.97156: variable 'ansible_shell_executable' from source: unknown 30582 1726855291.97159: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855291.97161: variable 'ansible_pipelining' from source: unknown 30582 1726855291.97163: variable 'ansible_timeout' from source: unknown 30582 1726855291.97165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855291.97529: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.97532: variable 'omit' from source: magic vars 30582 1726855291.97534: starting attempt loop 30582 1726855291.97537: running the handler 30582 1726855291.97539: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855291.97590: _low_level_execute_command(): starting 30582 1726855291.97605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855291.99139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855291.99156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855291.99315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855291.99330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855291.99411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855291.99609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855291.99859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.01522: stdout chunk (state=3): >>>/root <<< 30582 1726855292.01710: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30582 1726855292.01727: stdout chunk (state=3): >>><<< 30582 1726855292.01734: stderr chunk (state=3): >>><<< 30582 1726855292.01762: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855292.02056: _low_level_execute_command(): starting 30582 1726855292.02060: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123 `" && echo ansible-tmp-1726855292.0182273-31912-110175299421123="` echo /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123 `" ) && sleep 0' 30582 1726855292.03138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855292.03303: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.03318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855292.03338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855292.03424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855292.03481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.05797: stdout chunk (state=3): >>>ansible-tmp-1726855292.0182273-31912-110175299421123=/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123 <<< 30582 1726855292.05801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855292.05804: stdout chunk (state=3): >>><<< 30582 1726855292.05806: stderr chunk (state=3): >>><<< 30582 1726855292.05809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855292.0182273-31912-110175299421123=/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855292.05811: variable 'ansible_module_compression' from source: unknown 30582 1726855292.05860: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855292.06037: variable 'ansible_facts' from source: unknown 30582 1726855292.06071: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py 30582 1726855292.06621: Sending initial data 30582 1726855292.06625: Sent initial data (156 bytes) 30582 1726855292.07686: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855292.07903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.07920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855292.08258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855292.08262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855292.08265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.09794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855292.09798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855292.09838: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9txod0tk /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py <<< 30582 1726855292.09842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py" <<< 30582 1726855292.10120: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9txod0tk" to remote "/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py" <<< 30582 1726855292.11393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855292.11468: stderr chunk (state=3): >>><<< 30582 1726855292.11471: stdout chunk (state=3): >>><<< 30582 1726855292.11501: done transferring module to remote 30582 1726855292.11529: _low_level_execute_command(): starting 30582 1726855292.11533: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/ /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py && sleep 0' 30582 1726855292.12757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855292.12896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855292.12899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855292.12902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.12904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855292.12906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855292.12908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855292.12910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.13178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.15026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855292.15079: stderr chunk (state=3): >>><<< 30582 1726855292.15194: stdout chunk (state=3): >>><<< 30582 1726855292.15211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855292.15219: _low_level_execute_command(): starting 30582 1726855292.15446: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/AnsiballZ_command.py && sleep 0' 30582 1726855292.16560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855292.16568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855292.16586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855292.16596: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855292.16608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.16622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855292.16778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855292.16893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855292.16990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.36344: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (32e4a47f-d12d-469b-92d8-81cf9f125a33) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:32.327813", "end": "2024-09-20 14:01:32.362177", "delta": "0:00:00.034364", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855292.38310: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855292.38314: stdout chunk (state=3): >>><<< 30582 1726855292.38348: stderr chunk (state=3): >>><<< 30582 1726855292.38352: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (32e4a47f-d12d-469b-92d8-81cf9f125a33) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:32.327813", "end": "2024-09-20 14:01:32.362177", "delta": "0:00:00.034364", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 30582 1726855292.38379: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855292.38457: _low_level_execute_command(): starting 30582 1726855292.38461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855292.0182273-31912-110175299421123/ > /dev/null 2>&1 && sleep 0' 30582 1726855292.39703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855292.39726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855292.39743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855292.39765: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855292.39796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855292.39995: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855292.40038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855292.40233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855292.40581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855292.42390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855292.42423: stdout chunk (state=3): >>><<< 30582 1726855292.42427: stderr chunk (state=3): >>><<< 30582 1726855292.42445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855292.42511: handler run complete 30582 1726855292.42515: Evaluated conditional (False): False 30582 1726855292.42565: attempt loop complete, returning result 30582 1726855292.42577: _execute() done 30582 1726855292.42586: dumping result to json 30582 1726855292.42600: done dumping result, returning 30582 1726855292.42729: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-0000000009a0] 30582 1726855292.42801: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000009a0 30582 1726855292.42883: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000009a0 fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.034364", "end": "2024-09-20 14:01:32.362177", "rc": 1, "start": "2024-09-20 14:01:32.327813" } STDOUT: Connection 'statebr' (32e4a47f-d12d-469b-92d8-81cf9f125a33) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855292.42982: no more pending results, returning what we have 30582 1726855292.42989: results queue empty 30582 1726855292.42991: checking for any_errors_fatal 30582 1726855292.42993: done checking for any_errors_fatal 30582 1726855292.42994: checking for max_fail_percentage 30582 1726855292.43200: done checking for max_fail_percentage 30582 1726855292.43202: checking to see if all hosts have failed and the running result is not ok 30582 1726855292.43203: done checking to see if all hosts have failed 30582 1726855292.43204: getting the remaining hosts for this loop 30582 1726855292.43206: done getting the remaining hosts for this loop 30582 1726855292.43210: getting the next task for host managed_node3 30582 1726855292.43222: done getting next task for host managed_node3 30582 1726855292.43226: ^ task is: TASK: Include the task 'run_test.yml' 30582 1726855292.43228: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855292.43232: getting variables 30582 1726855292.43234: in VariableManager get_vars() 30582 1726855292.43269: Calling all_inventory to load vars for managed_node3 30582 1726855292.43275: Calling groups_inventory to load vars for managed_node3 30582 1726855292.43279: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855292.43500: Calling all_plugins_play to load vars for managed_node3 30582 1726855292.43504: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855292.43510: WORKER PROCESS EXITING 30582 1726855292.43514: Calling groups_plugins_play to load vars for managed_node3 30582 1726855292.46086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855292.49608: done with get_vars() 30582 1726855292.49630: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 14:01:32 -0400 (0:00:00.558) 0:00:28.848 ****** 30582 1726855292.49854: entering _queue_task() for managed_node3/include_tasks 30582 1726855292.50434: worker is 1 (out of 1 available) 30582 1726855292.50446: exiting _queue_task() for managed_node3/include_tasks 30582 1726855292.50458: done queuing things up, now waiting for results queue to drain 30582 1726855292.50459: waiting for pending results... 
30582 1726855292.50664: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30582 1726855292.50781: in run() - task 0affcc66-ac2b-aa83-7d57-000000000011 30582 1726855292.50833: variable 'ansible_search_path' from source: unknown 30582 1726855292.50892: calling self._execute() 30582 1726855292.51055: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.51089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.51133: variable 'omit' from source: magic vars 30582 1726855292.52220: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.52242: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.52255: _execute() done 30582 1726855292.52379: dumping result to json 30582 1726855292.52384: done dumping result, returning 30582 1726855292.52389: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-000000000011] 30582 1726855292.52391: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000011 30582 1726855292.52635: no more pending results, returning what we have 30582 1726855292.52642: in VariableManager get_vars() 30582 1726855292.52694: Calling all_inventory to load vars for managed_node3 30582 1726855292.52698: Calling groups_inventory to load vars for managed_node3 30582 1726855292.52702: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855292.52722: Calling all_plugins_play to load vars for managed_node3 30582 1726855292.52727: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855292.52730: Calling groups_plugins_play to load vars for managed_node3 30582 1726855292.53333: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000011 30582 1726855292.53337: WORKER PROCESS EXITING 30582 1726855292.55569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30582 1726855292.58684: done with get_vars() 30582 1726855292.58717: variable 'ansible_search_path' from source: unknown 30582 1726855292.58735: we have included files to process 30582 1726855292.58736: generating all_blocks data 30582 1726855292.58738: done generating all_blocks data 30582 1726855292.58746: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855292.58747: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855292.58750: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855292.59216: in VariableManager get_vars() 30582 1726855292.59233: done with get_vars() 30582 1726855292.59276: in VariableManager get_vars() 30582 1726855292.59293: done with get_vars() 30582 1726855292.59329: in VariableManager get_vars() 30582 1726855292.59343: done with get_vars() 30582 1726855292.59384: in VariableManager get_vars() 30582 1726855292.59401: done with get_vars() 30582 1726855292.59437: in VariableManager get_vars() 30582 1726855292.59451: done with get_vars() 30582 1726855292.59825: in VariableManager get_vars() 30582 1726855292.59839: done with get_vars() 30582 1726855292.59849: done processing included file 30582 1726855292.59851: iterating over new_blocks loaded from include file 30582 1726855292.59852: in VariableManager get_vars() 30582 1726855292.59862: done with get_vars() 30582 1726855292.59863: filtering new block on tags 30582 1726855292.59964: done filtering new block on tags 30582 1726855292.59967: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30582 1726855292.59975: extending task lists for all hosts with included 
blocks 30582 1726855292.60017: done extending task lists 30582 1726855292.60018: done processing included files 30582 1726855292.60019: results queue empty 30582 1726855292.60020: checking for any_errors_fatal 30582 1726855292.60024: done checking for any_errors_fatal 30582 1726855292.60025: checking for max_fail_percentage 30582 1726855292.60026: done checking for max_fail_percentage 30582 1726855292.60027: checking to see if all hosts have failed and the running result is not ok 30582 1726855292.60028: done checking to see if all hosts have failed 30582 1726855292.60028: getting the remaining hosts for this loop 30582 1726855292.60029: done getting the remaining hosts for this loop 30582 1726855292.60032: getting the next task for host managed_node3 30582 1726855292.60036: done getting next task for host managed_node3 30582 1726855292.60038: ^ task is: TASK: TEST: {{ lsr_description }} 30582 1726855292.60040: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855292.60043: getting variables 30582 1726855292.60044: in VariableManager get_vars() 30582 1726855292.60052: Calling all_inventory to load vars for managed_node3 30582 1726855292.60054: Calling groups_inventory to load vars for managed_node3 30582 1726855292.60057: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855292.60062: Calling all_plugins_play to load vars for managed_node3 30582 1726855292.60065: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855292.60067: Calling groups_plugins_play to load vars for managed_node3 30582 1726855292.62276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855292.63493: done with get_vars() 30582 1726855292.63521: done getting variables 30582 1726855292.63573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855292.63682: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 14:01:32 -0400 (0:00:00.138) 0:00:28.987 ****** 30582 1726855292.63720: entering _queue_task() for managed_node3/debug 30582 1726855292.64089: worker is 1 (out of 1 available) 30582 1726855292.64101: exiting _queue_task() for managed_node3/debug 30582 1726855292.64113: done queuing things up, now waiting for results queue to drain 30582 1726855292.64115: waiting for pending results... 
30582 1726855292.64481: running TaskExecutor() for managed_node3/TASK: TEST: I can activate an existing profile 30582 1726855292.64581: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a49 30582 1726855292.64625: variable 'ansible_search_path' from source: unknown 30582 1726855292.64629: variable 'ansible_search_path' from source: unknown 30582 1726855292.64667: calling self._execute() 30582 1726855292.64759: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.64786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.64793: variable 'omit' from source: magic vars 30582 1726855292.65276: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.65281: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.65283: variable 'omit' from source: magic vars 30582 1726855292.65286: variable 'omit' from source: magic vars 30582 1726855292.65345: variable 'lsr_description' from source: include params 30582 1726855292.65382: variable 'omit' from source: magic vars 30582 1726855292.65444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855292.65481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.65511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855292.65539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.65556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.65595: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.65600: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.65610: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.65758: Set connection var ansible_timeout to 10 30582 1726855292.65762: Set connection var ansible_connection to ssh 30582 1726855292.65768: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.65776: Set connection var ansible_pipelining to False 30582 1726855292.65779: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.65782: Set connection var ansible_shell_type to sh 30582 1726855292.65808: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.65812: variable 'ansible_connection' from source: unknown 30582 1726855292.65814: variable 'ansible_module_compression' from source: unknown 30582 1726855292.65817: variable 'ansible_shell_type' from source: unknown 30582 1726855292.65943: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.65946: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.65949: variable 'ansible_pipelining' from source: unknown 30582 1726855292.65951: variable 'ansible_timeout' from source: unknown 30582 1726855292.65953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.66064: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.66086: variable 'omit' from source: magic vars 30582 1726855292.66161: starting attempt loop 30582 1726855292.66165: running the handler 30582 1726855292.66167: handler run complete 30582 1726855292.66190: attempt loop complete, returning result 30582 1726855292.66198: _execute() done 30582 1726855292.66204: dumping result to json 30582 1726855292.66212: done dumping result, returning 30582 
1726855292.66222: done running TaskExecutor() for managed_node3/TASK: TEST: I can activate an existing profile [0affcc66-ac2b-aa83-7d57-000000000a49] 30582 1726855292.66231: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a49 30582 1726855292.66558: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a49 30582 1726855292.66562: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I can activate an existing profile ########## 30582 1726855292.66626: no more pending results, returning what we have 30582 1726855292.66631: results queue empty 30582 1726855292.66633: checking for any_errors_fatal 30582 1726855292.66634: done checking for any_errors_fatal 30582 1726855292.66635: checking for max_fail_percentage 30582 1726855292.66637: done checking for max_fail_percentage 30582 1726855292.66638: checking to see if all hosts have failed and the running result is not ok 30582 1726855292.66639: done checking to see if all hosts have failed 30582 1726855292.66640: getting the remaining hosts for this loop 30582 1726855292.66642: done getting the remaining hosts for this loop 30582 1726855292.66646: getting the next task for host managed_node3 30582 1726855292.66655: done getting next task for host managed_node3 30582 1726855292.66659: ^ task is: TASK: Show item 30582 1726855292.66663: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855292.66668: getting variables
30582 1726855292.66669: in VariableManager get_vars()
30582 1726855292.66730: Calling all_inventory to load vars for managed_node3
30582 1726855292.66734: Calling groups_inventory to load vars for managed_node3
30582 1726855292.66738: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855292.66750: Calling all_plugins_play to load vars for managed_node3
30582 1726855292.66754: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855292.66757: Calling groups_plugins_play to load vars for managed_node3
30582 1726855292.69044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855292.71757: done with get_vars()
30582 1726855292.71794: done getting variables
30582 1726855292.72085: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024  14:01:32 -0400 (0:00:00.083)       0:00:29.071 ******
30582 1726855292.72121: entering _queue_task() for managed_node3/debug
30582 1726855292.72940: worker is 1 (out of 1 available)
30582 1726855292.72952: exiting _queue_task() for managed_node3/debug
30582 1726855292.73108: done queuing things up, now waiting for results queue to drain
30582 1726855292.73111: waiting for pending results...
30582 1726855292.73901: running TaskExecutor() for managed_node3/TASK: Show item 30582 1726855292.74297: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4a 30582 1726855292.74302: variable 'ansible_search_path' from source: unknown 30582 1726855292.74305: variable 'ansible_search_path' from source: unknown 30582 1726855292.74307: variable 'omit' from source: magic vars 30582 1726855292.74936: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.74944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.74957: variable 'omit' from source: magic vars 30582 1726855292.75918: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.75928: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.75934: variable 'omit' from source: magic vars 30582 1726855292.76201: variable 'omit' from source: magic vars 30582 1726855292.76247: variable 'item' from source: unknown 30582 1726855292.76555: variable 'item' from source: unknown 30582 1726855292.76579: variable 'omit' from source: magic vars 30582 1726855292.76731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855292.76765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.76786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855292.76939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.76951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.76991: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.76995: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855292.76997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.77282: Set connection var ansible_timeout to 10 30582 1726855292.77285: Set connection var ansible_connection to ssh 30582 1726855292.77293: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.77299: Set connection var ansible_pipelining to False 30582 1726855292.77308: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.77311: Set connection var ansible_shell_type to sh 30582 1726855292.77328: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.77331: variable 'ansible_connection' from source: unknown 30582 1726855292.77334: variable 'ansible_module_compression' from source: unknown 30582 1726855292.77336: variable 'ansible_shell_type' from source: unknown 30582 1726855292.77338: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.77342: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.77347: variable 'ansible_pipelining' from source: unknown 30582 1726855292.77350: variable 'ansible_timeout' from source: unknown 30582 1726855292.77527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.77761: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.77781: variable 'omit' from source: magic vars 30582 1726855292.77897: starting attempt loop 30582 1726855292.77901: running the handler 30582 1726855292.77952: variable 'lsr_description' from source: include params 30582 1726855292.78121: variable 'lsr_description' from source: include params 30582 1726855292.78133: handler run complete 30582 1726855292.78221: attempt loop 
complete, returning result
30582 1726855292.78292: variable 'item' from source: unknown
30582 1726855292.78306: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can activate an existing profile"
}
30582 1726855292.78628: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855292.78631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855292.78633: variable 'omit' from source: magic vars
30582 1726855292.79405: variable 'ansible_distribution_major_version' from source: facts
30582 1726855292.79408: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855292.79410: variable 'omit' from source: magic vars
30582 1726855292.79413: variable 'omit' from source: magic vars
30582 1726855292.79414: variable 'item' from source: unknown
30582 1726855292.79841: variable 'item' from source: unknown
30582 1726855292.79856: variable 'omit' from source: magic vars
30582 1726855292.79879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855292.79889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855292.79901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855292.79916: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855292.79919: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855292.79921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855292.80095: Set connection var ansible_timeout to 10
30582 1726855292.80232: Set connection var ansible_connection to ssh
30582
1726855292.80236: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.80242: Set connection var ansible_pipelining to False 30582 1726855292.80247: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.80250: Set connection var ansible_shell_type to sh 30582 1726855292.80271: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.80276: variable 'ansible_connection' from source: unknown 30582 1726855292.80280: variable 'ansible_module_compression' from source: unknown 30582 1726855292.80282: variable 'ansible_shell_type' from source: unknown 30582 1726855292.80284: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.80286: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.80290: variable 'ansible_pipelining' from source: unknown 30582 1726855292.80292: variable 'ansible_timeout' from source: unknown 30582 1726855292.80492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.80560: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.80563: variable 'omit' from source: magic vars 30582 1726855292.80746: starting attempt loop 30582 1726855292.80749: running the handler 30582 1726855292.80751: variable 'lsr_setup' from source: include params 30582 1726855292.80779: variable 'lsr_setup' from source: include params 30582 1726855292.80922: handler run complete 30582 1726855292.80936: attempt loop complete, returning result 30582 1726855292.80952: variable 'item' from source: unknown 30582 1726855292.81126: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 30582 1726855292.81448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.81451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.81454: variable 'omit' from source: magic vars 30582 1726855292.81590: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.81864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.81867: variable 'omit' from source: magic vars 30582 1726855292.81870: variable 'omit' from source: magic vars 30582 1726855292.81872: variable 'item' from source: unknown 30582 1726855292.81883: variable 'item' from source: unknown 30582 1726855292.81885: variable 'omit' from source: magic vars 30582 1726855292.82306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.82317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.82320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.82332: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.82334: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.82337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.82424: Set connection var ansible_timeout to 10 30582 1726855292.82427: Set connection var ansible_connection to ssh 30582 1726855292.82432: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.82437: Set connection var ansible_pipelining to False 30582 1726855292.82442: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.82444: Set connection var 
ansible_shell_type to sh 30582 1726855292.82465: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.82468: variable 'ansible_connection' from source: unknown 30582 1726855292.82471: variable 'ansible_module_compression' from source: unknown 30582 1726855292.82475: variable 'ansible_shell_type' from source: unknown 30582 1726855292.82478: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.82481: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.82483: variable 'ansible_pipelining' from source: unknown 30582 1726855292.82485: variable 'ansible_timeout' from source: unknown 30582 1726855292.82488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.82981: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.82989: variable 'omit' from source: magic vars 30582 1726855292.82993: starting attempt loop 30582 1726855292.82996: running the handler 30582 1726855292.83026: variable 'lsr_test' from source: include params 30582 1726855292.83091: variable 'lsr_test' from source: include params 30582 1726855292.83526: handler run complete 30582 1726855292.83540: attempt loop complete, returning result 30582 1726855292.83556: variable 'item' from source: unknown 30582 1726855292.83622: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 30582 1726855292.84112: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.84115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.84118: variable 'omit' from source: 
magic vars 30582 1726855292.84329: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.84332: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.84334: variable 'omit' from source: magic vars 30582 1726855292.84611: variable 'omit' from source: magic vars 30582 1726855292.84654: variable 'item' from source: unknown 30582 1726855292.84714: variable 'item' from source: unknown 30582 1726855292.84730: variable 'omit' from source: magic vars 30582 1726855292.84750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.84761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.84768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.84778: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.84781: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.84783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.85367: Set connection var ansible_timeout to 10 30582 1726855292.85375: Set connection var ansible_connection to ssh 30582 1726855292.85378: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.85381: Set connection var ansible_pipelining to False 30582 1726855292.85384: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.85386: Set connection var ansible_shell_type to sh 30582 1726855292.85414: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.85417: variable 'ansible_connection' from source: unknown 30582 1726855292.85420: variable 'ansible_module_compression' from source: unknown 30582 1726855292.85422: 
variable 'ansible_shell_type' from source: unknown 30582 1726855292.85424: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.85426: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.85428: variable 'ansible_pipelining' from source: unknown 30582 1726855292.85430: variable 'ansible_timeout' from source: unknown 30582 1726855292.85431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.85992: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.85996: variable 'omit' from source: magic vars 30582 1726855292.85998: starting attempt loop 30582 1726855292.86001: running the handler 30582 1726855292.86003: variable 'lsr_assert' from source: include params 30582 1726855292.86038: variable 'lsr_assert' from source: include params 30582 1726855292.86056: handler run complete 30582 1726855292.86068: attempt loop complete, returning result 30582 1726855292.86131: variable 'item' from source: unknown 30582 1726855292.86568: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 30582 1726855292.86665: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.86670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.86675: variable 'omit' from source: magic vars 30582 1726855292.88477: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.88481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.88490: variable 'omit' 
from source: magic vars 30582 1726855292.88493: variable 'omit' from source: magic vars 30582 1726855292.88495: variable 'item' from source: unknown 30582 1726855292.89394: variable 'item' from source: unknown 30582 1726855292.89397: variable 'omit' from source: magic vars 30582 1726855292.89400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.89402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.89403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.89405: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.89407: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.89408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.89419: Set connection var ansible_timeout to 10 30582 1726855292.89424: Set connection var ansible_connection to ssh 30582 1726855292.89434: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.89441: Set connection var ansible_pipelining to False 30582 1726855292.89448: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.89459: Set connection var ansible_shell_type to sh 30582 1726855292.89486: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.89792: variable 'ansible_connection' from source: unknown 30582 1726855292.89796: variable 'ansible_module_compression' from source: unknown 30582 1726855292.89798: variable 'ansible_shell_type' from source: unknown 30582 1726855292.89800: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.89802: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855292.89803: variable 'ansible_pipelining' from source: unknown
30582 1726855292.89805: variable 'ansible_timeout' from source: unknown
30582 1726855292.89807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855292.90192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855292.90196: variable 'omit' from source: magic vars
30582 1726855292.90198: starting attempt loop
30582 1726855292.90200: running the handler
30582 1726855292.90202: handler run complete
30582 1726855292.90204: attempt loop complete, returning result
30582 1726855292.90206: variable 'item' from source: unknown
30582 1726855292.90643: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_assert_when) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert_when",
    "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined"
}
30582 1726855292.90815: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855292.91293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855292.91296: variable 'omit' from source: magic vars
30582 1726855292.91569: variable 'ansible_distribution_major_version' from source: facts
30582 1726855292.91585: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855292.91597: variable 'omit' from source: magic vars
30582 1726855292.91616: variable 'omit' from source: magic vars
30582 1726855292.91661: variable 'item' from source: unknown
30582 1726855292.91957: variable 'item' from source: unknown
30582 1726855292.92293: variable 'omit' from source: magic vars
30582 1726855292.92297: Loading Connection 'ssh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.92299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.92301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.92303: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.92305: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.92307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.92308: Set connection var ansible_timeout to 10 30582 1726855292.92310: Set connection var ansible_connection to ssh 30582 1726855292.92339: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.92349: Set connection var ansible_pipelining to False 30582 1726855292.92359: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.92367: Set connection var ansible_shell_type to sh 30582 1726855292.92398: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.92406: variable 'ansible_connection' from source: unknown 30582 1726855292.92413: variable 'ansible_module_compression' from source: unknown 30582 1726855292.92420: variable 'ansible_shell_type' from source: unknown 30582 1726855292.92426: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.92433: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.92440: variable 'ansible_pipelining' from source: unknown 30582 1726855292.92446: variable 'ansible_timeout' from source: unknown 30582 1726855292.92453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.92552: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.92566: variable 'omit' from source: magic vars 30582 1726855292.92578: starting attempt loop 30582 1726855292.92585: running the handler 30582 1726855292.92612: variable 'lsr_fail_debug' from source: play vars 30582 1726855292.92694: variable 'lsr_fail_debug' from source: play vars 30582 1726855292.92993: handler run complete 30582 1726855292.92996: attempt loop complete, returning result 30582 1726855292.92998: variable 'item' from source: unknown 30582 1726855292.93018: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855292.93253: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.93307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.93320: variable 'omit' from source: magic vars 30582 1726855292.93586: variable 'ansible_distribution_major_version' from source: facts 30582 1726855292.93892: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855292.93896: variable 'omit' from source: magic vars 30582 1726855292.93898: variable 'omit' from source: magic vars 30582 1726855292.93900: variable 'item' from source: unknown 30582 1726855292.93946: variable 'item' from source: unknown 30582 1726855292.93966: variable 'omit' from source: magic vars 30582 1726855292.94015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855292.94061: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.94072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855292.94099: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855292.94155: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.94165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.94263: Set connection var ansible_timeout to 10 30582 1726855292.94298: Set connection var ansible_connection to ssh 30582 1726855292.94311: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855292.94320: Set connection var ansible_pipelining to False 30582 1726855292.94327: Set connection var ansible_shell_executable to /bin/sh 30582 1726855292.94333: Set connection var ansible_shell_type to sh 30582 1726855292.94354: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.94360: variable 'ansible_connection' from source: unknown 30582 1726855292.94366: variable 'ansible_module_compression' from source: unknown 30582 1726855292.94376: variable 'ansible_shell_type' from source: unknown 30582 1726855292.94491: variable 'ansible_shell_executable' from source: unknown 30582 1726855292.94495: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855292.94497: variable 'ansible_pipelining' from source: unknown 30582 1726855292.94499: variable 'ansible_timeout' from source: unknown 30582 1726855292.94501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855292.94520: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855292.94534: variable 'omit' from source: magic vars 30582 1726855292.94542: starting attempt loop 30582 1726855292.94548: running the handler 30582 1726855292.94571: variable 'lsr_cleanup' from source: include params 30582 1726855292.94640: variable 'lsr_cleanup' from source: include params 30582 1726855292.94662: handler run complete 30582 1726855292.94684: attempt loop complete, returning result 30582 1726855292.94705: variable 'item' from source: unknown 30582 1726855292.94764: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30582 1726855292.95092: dumping result to json 30582 1726855292.95095: done dumping result, returning 30582 1726855292.95097: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-000000000a4a] 30582 1726855292.95100: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4a 30582 1726855292.95143: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4a 30582 1726855292.95147: WORKER PROCESS EXITING 30582 1726855292.95236: no more pending results, returning what we have 30582 1726855292.95241: results queue empty 30582 1726855292.95242: checking for any_errors_fatal 30582 1726855292.95247: done checking for any_errors_fatal 30582 1726855292.95247: checking for max_fail_percentage 30582 1726855292.95249: done checking for max_fail_percentage 30582 1726855292.95250: checking to see if all hosts have failed and the running result is not ok 30582 1726855292.95250: done checking to see if all hosts have failed 30582 1726855292.95251: getting the remaining hosts for this loop 30582 1726855292.95252: done getting the remaining hosts for this loop 30582 
1726855292.95255: getting the next task for host managed_node3
30582 1726855292.95261: done getting next task for host managed_node3
30582 1726855292.95263: ^ task is: TASK: Include the task 'show_interfaces.yml'
30582 1726855292.95267: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855292.95270: getting variables
30582 1726855292.95272: in VariableManager get_vars()
30582 1726855292.95299: Calling all_inventory to load vars for managed_node3
30582 1726855292.95302: Calling groups_inventory to load vars for managed_node3
30582 1726855292.95305: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855292.95314: Calling all_plugins_play to load vars for managed_node3
30582 1726855292.95317: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855292.95319: Calling groups_plugins_play to load vars for managed_node3
30582 1726855292.98765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855293.00834: done with get_vars()
30582 1726855293.00865: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21
Friday 20 September 2024  14:01:33 -0400 (0:00:00.288)       0:00:29.359 ******
30582 1726855293.00976: entering _queue_task() for managed_node3/include_tasks
30582
1726855293.01464: worker is 1 (out of 1 available) 30582 1726855293.01480: exiting _queue_task() for managed_node3/include_tasks 30582 1726855293.01639: done queuing things up, now waiting for results queue to drain 30582 1726855293.01642: waiting for pending results... 30582 1726855293.02150: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855293.02495: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4b 30582 1726855293.02500: variable 'ansible_search_path' from source: unknown 30582 1726855293.02503: variable 'ansible_search_path' from source: unknown 30582 1726855293.02506: calling self._execute() 30582 1726855293.02638: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.02641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.02654: variable 'omit' from source: magic vars 30582 1726855293.03306: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.03319: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.03325: _execute() done 30582 1726855293.03329: dumping result to json 30582 1726855293.03331: done dumping result, returning 30582 1726855293.03339: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-000000000a4b] 30582 1726855293.03344: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4b 30582 1726855293.03449: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4b 30582 1726855293.03453: WORKER PROCESS EXITING 30582 1726855293.03486: no more pending results, returning what we have 30582 1726855293.03505: in VariableManager get_vars() 30582 1726855293.03555: Calling all_inventory to load vars for managed_node3 30582 1726855293.03558: Calling groups_inventory to load vars for managed_node3 30582 1726855293.03562: Calling all_plugins_inventory to load vars for managed_node3 
30582 1726855293.03578: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.03582: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.03585: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.05456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.07010: done with get_vars() 30582 1726855293.07039: variable 'ansible_search_path' from source: unknown 30582 1726855293.07041: variable 'ansible_search_path' from source: unknown 30582 1726855293.07089: we have included files to process 30582 1726855293.07090: generating all_blocks data 30582 1726855293.07092: done generating all_blocks data 30582 1726855293.07097: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855293.07098: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855293.07101: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855293.07212: in VariableManager get_vars() 30582 1726855293.07233: done with get_vars() 30582 1726855293.07356: done processing included file 30582 1726855293.07358: iterating over new_blocks loaded from include file 30582 1726855293.07359: in VariableManager get_vars() 30582 1726855293.07377: done with get_vars() 30582 1726855293.07379: filtering new block on tags 30582 1726855293.07418: done filtering new block on tags 30582 1726855293.07421: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30582 1726855293.07426: extending task lists for all hosts with included blocks 30582 1726855293.08302: 
done extending task lists 30582 1726855293.08304: done processing included files 30582 1726855293.08305: results queue empty 30582 1726855293.08305: checking for any_errors_fatal 30582 1726855293.08312: done checking for any_errors_fatal 30582 1726855293.08313: checking for max_fail_percentage 30582 1726855293.08314: done checking for max_fail_percentage 30582 1726855293.08315: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.08316: done checking to see if all hosts have failed 30582 1726855293.08317: getting the remaining hosts for this loop 30582 1726855293.08318: done getting the remaining hosts for this loop 30582 1726855293.08321: getting the next task for host managed_node3 30582 1726855293.08325: done getting next task for host managed_node3 30582 1726855293.08328: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855293.08331: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855293.08334: getting variables 30582 1726855293.08335: in VariableManager get_vars() 30582 1726855293.08347: Calling all_inventory to load vars for managed_node3 30582 1726855293.08349: Calling groups_inventory to load vars for managed_node3 30582 1726855293.08352: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.08357: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.08359: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.08361: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.09803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.11455: done with get_vars() 30582 1726855293.11491: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 14:01:33 -0400 (0:00:00.105) 0:00:29.465 ****** 30582 1726855293.11561: entering _queue_task() for managed_node3/include_tasks 30582 1726855293.12327: worker is 1 (out of 1 available) 30582 1726855293.12342: exiting _queue_task() for managed_node3/include_tasks 30582 1726855293.12354: done queuing things up, now waiting for results queue to drain 30582 1726855293.12355: waiting for pending results... 
30582 1726855293.12982: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855293.12990: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a72 30582 1726855293.13100: variable 'ansible_search_path' from source: unknown 30582 1726855293.13111: variable 'ansible_search_path' from source: unknown 30582 1726855293.13164: calling self._execute() 30582 1726855293.13313: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.13326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.13342: variable 'omit' from source: magic vars 30582 1726855293.14497: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.14501: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.14504: _execute() done 30582 1726855293.14506: dumping result to json 30582 1726855293.14508: done dumping result, returning 30582 1726855293.14511: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-000000000a72] 30582 1726855293.14512: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a72 30582 1726855293.14581: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a72 30582 1726855293.14584: WORKER PROCESS EXITING 30582 1726855293.14631: no more pending results, returning what we have 30582 1726855293.14637: in VariableManager get_vars() 30582 1726855293.14683: Calling all_inventory to load vars for managed_node3 30582 1726855293.14686: Calling groups_inventory to load vars for managed_node3 30582 1726855293.14692: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.14714: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.14719: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.14723: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855293.16541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.19098: done with get_vars() 30582 1726855293.19132: variable 'ansible_search_path' from source: unknown 30582 1726855293.19134: variable 'ansible_search_path' from source: unknown 30582 1726855293.19196: we have included files to process 30582 1726855293.19198: generating all_blocks data 30582 1726855293.19200: done generating all_blocks data 30582 1726855293.19201: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855293.19202: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855293.19205: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855293.19496: done processing included file 30582 1726855293.19498: iterating over new_blocks loaded from include file 30582 1726855293.19500: in VariableManager get_vars() 30582 1726855293.19517: done with get_vars() 30582 1726855293.19519: filtering new block on tags 30582 1726855293.19556: done filtering new block on tags 30582 1726855293.19558: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855293.19564: extending task lists for all hosts with included blocks 30582 1726855293.19735: done extending task lists 30582 1726855293.19736: done processing included files 30582 1726855293.19737: results queue empty 30582 1726855293.19737: checking for any_errors_fatal 30582 1726855293.19742: done checking for any_errors_fatal 30582 1726855293.19742: checking for max_fail_percentage 30582 1726855293.19744: done 
checking for max_fail_percentage 30582 1726855293.19745: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.19745: done checking to see if all hosts have failed 30582 1726855293.19746: getting the remaining hosts for this loop 30582 1726855293.19747: done getting the remaining hosts for this loop 30582 1726855293.19750: getting the next task for host managed_node3 30582 1726855293.19754: done getting next task for host managed_node3 30582 1726855293.19756: ^ task is: TASK: Gather current interface info 30582 1726855293.19759: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855293.19762: getting variables 30582 1726855293.19763: in VariableManager get_vars() 30582 1726855293.19776: Calling all_inventory to load vars for managed_node3 30582 1726855293.19778: Calling groups_inventory to load vars for managed_node3 30582 1726855293.19781: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.19786: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.19789: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.19793: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.20984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.22520: done with get_vars() 30582 1726855293.22553: done getting variables 30582 1726855293.22607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:01:33 -0400 (0:00:00.110) 0:00:29.576 ****** 30582 1726855293.22643: entering _queue_task() for managed_node3/command 30582 1726855293.23026: worker is 1 (out of 1 available) 30582 1726855293.23038: exiting _queue_task() for managed_node3/command 30582 1726855293.23049: done queuing things up, now waiting for results queue to drain 30582 1726855293.23051: waiting for pending results... 
30582 1726855293.23416: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855293.23489: in run() - task 0affcc66-ac2b-aa83-7d57-000000000aad 30582 1726855293.23692: variable 'ansible_search_path' from source: unknown 30582 1726855293.23697: variable 'ansible_search_path' from source: unknown 30582 1726855293.23700: calling self._execute() 30582 1726855293.23703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.23705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.23708: variable 'omit' from source: magic vars 30582 1726855293.24059: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.24079: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.24093: variable 'omit' from source: magic vars 30582 1726855293.24149: variable 'omit' from source: magic vars 30582 1726855293.24192: variable 'omit' from source: magic vars 30582 1726855293.24237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855293.24286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855293.24314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855293.24335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.24351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.24394: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855293.24404: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.24412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855293.24524: Set connection var ansible_timeout to 10 30582 1726855293.24532: Set connection var ansible_connection to ssh 30582 1726855293.24544: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855293.24552: Set connection var ansible_pipelining to False 30582 1726855293.24561: Set connection var ansible_shell_executable to /bin/sh 30582 1726855293.24567: Set connection var ansible_shell_type to sh 30582 1726855293.24603: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.24610: variable 'ansible_connection' from source: unknown 30582 1726855293.24618: variable 'ansible_module_compression' from source: unknown 30582 1726855293.24625: variable 'ansible_shell_type' from source: unknown 30582 1726855293.24632: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.24693: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.24696: variable 'ansible_pipelining' from source: unknown 30582 1726855293.24699: variable 'ansible_timeout' from source: unknown 30582 1726855293.24701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.24810: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855293.24828: variable 'omit' from source: magic vars 30582 1726855293.24838: starting attempt loop 30582 1726855293.24846: running the handler 30582 1726855293.24867: _low_level_execute_command(): starting 30582 1726855293.24882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855293.25641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855293.25657: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30582 1726855293.25789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855293.25795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855293.25806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.25825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.25923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855293.27619: stdout chunk (state=3): >>>/root <<< 30582 1726855293.27755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855293.27760: stdout chunk (state=3): >>><<< 30582 1726855293.27893: stderr chunk (state=3): >>><<< 30582 1726855293.27898: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855293.27902: _low_level_execute_command(): starting 30582 1726855293.27909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241 `" && echo ansible-tmp-1726855293.27805-31973-238545062199241="` echo /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241 `" ) && sleep 0' 30582 1726855293.28468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855293.28478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855293.28490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855293.28510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855293.28522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855293.28533: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855293.28536: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855293.28631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.28647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.28748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855293.30664: stdout chunk (state=3): >>>ansible-tmp-1726855293.27805-31973-238545062199241=/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241 <<< 30582 1726855293.30832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855293.30863: stderr chunk (state=3): >>><<< 30582 1726855293.30866: stdout chunk (state=3): >>><<< 30582 1726855293.31093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855293.27805-31973-238545062199241=/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855293.31097: variable 'ansible_module_compression' from source: unknown 30582 1726855293.31099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855293.31101: variable 'ansible_facts' from source: unknown 30582 1726855293.31131: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py 30582 1726855293.31358: Sending initial data 30582 1726855293.31361: Sent initial data (154 bytes) 30582 1726855293.31980: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855293.32112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.32133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.32230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855293.33786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855293.34121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855293.34125: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp4ohjq7pj /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py <<< 30582 1726855293.34127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py" <<< 30582 1726855293.34194: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp4ohjq7pj" to remote "/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py" <<< 30582 1726855293.35039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855293.35073: stderr chunk (state=3): >>><<< 30582 1726855293.35096: stdout chunk (state=3): >>><<< 30582 1726855293.35204: done transferring module to remote 30582 1726855293.35207: _low_level_execute_command(): starting 30582 1726855293.35210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/ /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py && sleep 0' 30582 1726855293.35874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855293.35892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855293.35906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855293.35925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855293.36036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.36074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.36174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855293.38041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855293.38050: stdout chunk (state=3): >>><<< 30582 1726855293.38225: stderr chunk (state=3): >>><<< 30582 1726855293.38230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855293.38232: _low_level_execute_command(): starting 30582 1726855293.38235: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/AnsiballZ_command.py && sleep 0' 30582 1726855293.38980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855293.39043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.39067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.39164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 
1726855293.54766: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:33.542253", "end": "2024-09-20 14:01:33.545515", "delta": "0:00:00.003262", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855293.56279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855293.56284: stdout chunk (state=3): >>><<< 30582 1726855293.56290: stderr chunk (state=3): >>><<< 30582 1726855293.56643: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:33.542253", "end": "2024-09-20 14:01:33.545515", "delta": "0:00:00.003262", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855293.56648: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855293.56652: _low_level_execute_command(): starting 30582 1726855293.56654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855293.27805-31973-238545062199241/ > /dev/null 2>&1 && sleep 0' 30582 1726855293.57781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855293.57951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855293.58032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855293.58051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855293.58113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855293.58209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855293.60166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855293.60170: stdout chunk (state=3): >>><<< 30582 1726855293.60293: stderr chunk (state=3): >>><<< 30582 1726855293.60297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855293.60300: handler run complete 30582 1726855293.60302: Evaluated conditional (False): False 30582 1726855293.60304: attempt loop complete, returning result 30582 1726855293.60306: _execute() done 30582 1726855293.60308: dumping result to json 30582 1726855293.60310: done dumping result, returning 30582 1726855293.60311: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-000000000aad] 30582 1726855293.60313: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000aad 30582 1726855293.60421: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000aad 30582 1726855293.60425: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003262", "end": "2024-09-20 14:01:33.545515", "rc": 0, "start": "2024-09-20 14:01:33.542253" } STDOUT: bonding_masters eth0 lo rpltstbr 30582 1726855293.60568: no more pending results, returning what we have 30582 1726855293.60572: results queue empty 30582 1726855293.60573: checking for any_errors_fatal 30582 1726855293.60575: done checking for any_errors_fatal 30582 1726855293.60576: checking for max_fail_percentage 30582 1726855293.60578: done checking for max_fail_percentage 30582 1726855293.60579: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.60580: done checking to see if all hosts have failed 30582 
1726855293.60580: getting the remaining hosts for this loop 30582 1726855293.60582: done getting the remaining hosts for this loop 30582 1726855293.60586: getting the next task for host managed_node3 30582 1726855293.60597: done getting next task for host managed_node3 30582 1726855293.60603: ^ task is: TASK: Set current_interfaces 30582 1726855293.60608: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855293.60613: getting variables 30582 1726855293.60615: in VariableManager get_vars() 30582 1726855293.60647: Calling all_inventory to load vars for managed_node3 30582 1726855293.60650: Calling groups_inventory to load vars for managed_node3 30582 1726855293.60653: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.60666: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.60670: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.60672: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.69022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.70068: done with get_vars() 30582 1726855293.70097: done getting variables 30582 1726855293.70151: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 14:01:33 -0400 (0:00:00.475) 0:00:30.051 ****** 30582 1726855293.70184: entering _queue_task() for managed_node3/set_fact 30582 1726855293.70535: worker is 1 (out of 1 available) 30582 1726855293.70548: exiting _queue_task() for managed_node3/set_fact 30582 1726855293.70559: done queuing things up, now waiting for results queue to drain 30582 1726855293.70561: waiting for pending results... 
30582 1726855293.70880: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855293.70982: in run() - task 0affcc66-ac2b-aa83-7d57-000000000aae 30582 1726855293.70995: variable 'ansible_search_path' from source: unknown 30582 1726855293.71001: variable 'ansible_search_path' from source: unknown 30582 1726855293.71029: calling self._execute() 30582 1726855293.71104: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.71108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.71118: variable 'omit' from source: magic vars 30582 1726855293.71507: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.71511: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.71514: variable 'omit' from source: magic vars 30582 1726855293.71517: variable 'omit' from source: magic vars 30582 1726855293.71602: variable '_current_interfaces' from source: set_fact 30582 1726855293.71664: variable 'omit' from source: magic vars 30582 1726855293.71705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855293.71739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855293.71760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855293.71836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.71840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.71843: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855293.71846: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.71848: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.71918: Set connection var ansible_timeout to 10 30582 1726855293.71922: Set connection var ansible_connection to ssh 30582 1726855293.71928: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855293.71942: Set connection var ansible_pipelining to False 30582 1726855293.71945: Set connection var ansible_shell_executable to /bin/sh 30582 1726855293.71948: Set connection var ansible_shell_type to sh 30582 1726855293.71959: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.71962: variable 'ansible_connection' from source: unknown 30582 1726855293.71964: variable 'ansible_module_compression' from source: unknown 30582 1726855293.71967: variable 'ansible_shell_type' from source: unknown 30582 1726855293.71970: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.71972: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.71977: variable 'ansible_pipelining' from source: unknown 30582 1726855293.71979: variable 'ansible_timeout' from source: unknown 30582 1726855293.71981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.72083: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855293.72093: variable 'omit' from source: magic vars 30582 1726855293.72098: starting attempt loop 30582 1726855293.72101: running the handler 30582 1726855293.72111: handler run complete 30582 1726855293.72118: attempt loop complete, returning result 30582 1726855293.72121: _execute() done 30582 1726855293.72123: dumping result to json 30582 1726855293.72126: done dumping result, returning 30582 
1726855293.72133: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-000000000aae] 30582 1726855293.72137: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000aae 30582 1726855293.72220: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000aae 30582 1726855293.72222: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30582 1726855293.72286: no more pending results, returning what we have 30582 1726855293.72292: results queue empty 30582 1726855293.72293: checking for any_errors_fatal 30582 1726855293.72303: done checking for any_errors_fatal 30582 1726855293.72304: checking for max_fail_percentage 30582 1726855293.72306: done checking for max_fail_percentage 30582 1726855293.72306: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.72307: done checking to see if all hosts have failed 30582 1726855293.72308: getting the remaining hosts for this loop 30582 1726855293.72309: done getting the remaining hosts for this loop 30582 1726855293.72313: getting the next task for host managed_node3 30582 1726855293.72323: done getting next task for host managed_node3 30582 1726855293.72325: ^ task is: TASK: Show current_interfaces 30582 1726855293.72329: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855293.72333: getting variables 30582 1726855293.72334: in VariableManager get_vars() 30582 1726855293.72367: Calling all_inventory to load vars for managed_node3 30582 1726855293.72379: Calling groups_inventory to load vars for managed_node3 30582 1726855293.72383: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.72396: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.72399: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.72401: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.73192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.74176: done with get_vars() 30582 1726855293.74194: done getting variables 30582 1726855293.74239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:01:33 -0400 (0:00:00.040) 0:00:30.092 ****** 30582 1726855293.74264: entering _queue_task() for managed_node3/debug 30582 1726855293.74529: worker is 1 (out of 1 available) 30582 1726855293.74543: exiting _queue_task() for managed_node3/debug 30582 1726855293.74554: done queuing things up, now waiting for results queue to drain 30582 1726855293.74556: waiting for 
pending results... 30582 1726855293.74737: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855293.74820: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a73 30582 1726855293.74833: variable 'ansible_search_path' from source: unknown 30582 1726855293.74837: variable 'ansible_search_path' from source: unknown 30582 1726855293.74864: calling self._execute() 30582 1726855293.74933: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.74937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.74946: variable 'omit' from source: magic vars 30582 1726855293.75225: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.75235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.75241: variable 'omit' from source: magic vars 30582 1726855293.75271: variable 'omit' from source: magic vars 30582 1726855293.75341: variable 'current_interfaces' from source: set_fact 30582 1726855293.75362: variable 'omit' from source: magic vars 30582 1726855293.75396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855293.75422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855293.75441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855293.75453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.75464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855293.75491: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855293.75494: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.75496: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.75567: Set connection var ansible_timeout to 10 30582 1726855293.75570: Set connection var ansible_connection to ssh 30582 1726855293.75578: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855293.75580: Set connection var ansible_pipelining to False 30582 1726855293.75585: Set connection var ansible_shell_executable to /bin/sh 30582 1726855293.75589: Set connection var ansible_shell_type to sh 30582 1726855293.75606: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.75610: variable 'ansible_connection' from source: unknown 30582 1726855293.75612: variable 'ansible_module_compression' from source: unknown 30582 1726855293.75615: variable 'ansible_shell_type' from source: unknown 30582 1726855293.75617: variable 'ansible_shell_executable' from source: unknown 30582 1726855293.75619: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.75622: variable 'ansible_pipelining' from source: unknown 30582 1726855293.75625: variable 'ansible_timeout' from source: unknown 30582 1726855293.75630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.75733: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855293.75743: variable 'omit' from source: magic vars 30582 1726855293.75748: starting attempt loop 30582 1726855293.75751: running the handler 30582 1726855293.75792: handler run complete 30582 1726855293.75802: attempt loop complete, returning result 30582 1726855293.75805: _execute() done 30582 1726855293.75808: dumping result to json 30582 1726855293.75810: done dumping result, returning 30582 
1726855293.75817: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-000000000a73] 30582 1726855293.75822: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a73 30582 1726855293.75905: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a73 30582 1726855293.75908: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855293.75949: no more pending results, returning what we have 30582 1726855293.75952: results queue empty 30582 1726855293.75953: checking for any_errors_fatal 30582 1726855293.75959: done checking for any_errors_fatal 30582 1726855293.75959: checking for max_fail_percentage 30582 1726855293.75961: done checking for max_fail_percentage 30582 1726855293.75962: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.75963: done checking to see if all hosts have failed 30582 1726855293.75963: getting the remaining hosts for this loop 30582 1726855293.75965: done getting the remaining hosts for this loop 30582 1726855293.75969: getting the next task for host managed_node3 30582 1726855293.75980: done getting next task for host managed_node3 30582 1726855293.75983: ^ task is: TASK: Setup 30582 1726855293.75985: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855293.75992: getting variables 30582 1726855293.75994: in VariableManager get_vars() 30582 1726855293.76023: Calling all_inventory to load vars for managed_node3 30582 1726855293.76026: Calling groups_inventory to load vars for managed_node3 30582 1726855293.76029: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.76039: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.76042: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.76045: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.76860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.77743: done with get_vars() 30582 1726855293.77758: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:01:33 -0400 (0:00:00.035) 0:00:30.128 ****** 30582 1726855293.77827: entering _queue_task() for managed_node3/include_tasks 30582 1726855293.78076: worker is 1 (out of 1 available) 30582 1726855293.78090: exiting _queue_task() for managed_node3/include_tasks 30582 1726855293.78102: done queuing things up, now waiting for results queue to drain 30582 1726855293.78104: waiting for pending results... 
30582 1726855293.78415: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855293.78420: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4c 30582 1726855293.78423: variable 'ansible_search_path' from source: unknown 30582 1726855293.78426: variable 'ansible_search_path' from source: unknown 30582 1726855293.78457: variable 'lsr_setup' from source: include params 30582 1726855293.78663: variable 'lsr_setup' from source: include params 30582 1726855293.78728: variable 'omit' from source: magic vars 30582 1726855293.78992: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.78996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.78998: variable 'omit' from source: magic vars 30582 1726855293.79084: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.79094: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.79101: variable 'item' from source: unknown 30582 1726855293.79156: variable 'item' from source: unknown 30582 1726855293.79191: variable 'item' from source: unknown 30582 1726855293.79243: variable 'item' from source: unknown 30582 1726855293.79534: dumping result to json 30582 1726855293.79538: done dumping result, returning 30582 1726855293.79540: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-000000000a4c] 30582 1726855293.79542: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4c 30582 1726855293.79581: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4c 30582 1726855293.79584: WORKER PROCESS EXITING 30582 1726855293.79606: no more pending results, returning what we have 30582 1726855293.79610: in VariableManager get_vars() 30582 1726855293.79639: Calling all_inventory to load vars for managed_node3 30582 1726855293.79642: Calling groups_inventory to load vars for managed_node3 30582 1726855293.79644: Calling all_plugins_inventory to 
load vars for managed_node3 30582 1726855293.79653: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.79656: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.79658: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.80814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.81661: done with get_vars() 30582 1726855293.81676: variable 'ansible_search_path' from source: unknown 30582 1726855293.81677: variable 'ansible_search_path' from source: unknown 30582 1726855293.81706: we have included files to process 30582 1726855293.81707: generating all_blocks data 30582 1726855293.81708: done generating all_blocks data 30582 1726855293.81711: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855293.81711: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855293.81713: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855293.81868: done processing included file 30582 1726855293.81870: iterating over new_blocks loaded from include file 30582 1726855293.81871: in VariableManager get_vars() 30582 1726855293.81885: done with get_vars() 30582 1726855293.81886: filtering new block on tags 30582 1726855293.81910: done filtering new block on tags 30582 1726855293.81912: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30582 1726855293.81915: extending task lists for all hosts with included blocks 30582 1726855293.82262: done 
extending task lists 30582 1726855293.82263: done processing included files 30582 1726855293.82264: results queue empty 30582 1726855293.82264: checking for any_errors_fatal 30582 1726855293.82266: done checking for any_errors_fatal 30582 1726855293.82267: checking for max_fail_percentage 30582 1726855293.82267: done checking for max_fail_percentage 30582 1726855293.82268: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.82269: done checking to see if all hosts have failed 30582 1726855293.82269: getting the remaining hosts for this loop 30582 1726855293.82270: done getting the remaining hosts for this loop 30582 1726855293.82271: getting the next task for host managed_node3 30582 1726855293.82276: done getting next task for host managed_node3 30582 1726855293.82278: ^ task is: TASK: Include network role 30582 1726855293.82279: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855293.82281: getting variables 30582 1726855293.82282: in VariableManager get_vars() 30582 1726855293.82289: Calling all_inventory to load vars for managed_node3 30582 1726855293.82291: Calling groups_inventory to load vars for managed_node3 30582 1726855293.82292: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.82296: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.82298: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.82300: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.82948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.83803: done with get_vars() 30582 1726855293.83822: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 14:01:33 -0400 (0:00:00.060) 0:00:30.188 ****** 30582 1726855293.83883: entering _queue_task() for managed_node3/include_role 30582 1726855293.84152: worker is 1 (out of 1 available) 30582 1726855293.84165: exiting _queue_task() for managed_node3/include_role 30582 1726855293.84180: done queuing things up, now waiting for results queue to drain 30582 1726855293.84182: waiting for pending results... 
30582 1726855293.84364: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855293.84461: in run() - task 0affcc66-ac2b-aa83-7d57-000000000ad1 30582 1726855293.84472: variable 'ansible_search_path' from source: unknown 30582 1726855293.84478: variable 'ansible_search_path' from source: unknown 30582 1726855293.84507: calling self._execute() 30582 1726855293.84580: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.84585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.84594: variable 'omit' from source: magic vars 30582 1726855293.84867: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.84878: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.84882: _execute() done 30582 1726855293.84885: dumping result to json 30582 1726855293.84890: done dumping result, returning 30582 1726855293.84898: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000000ad1] 30582 1726855293.84903: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ad1 30582 1726855293.85013: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ad1 30582 1726855293.85015: WORKER PROCESS EXITING 30582 1726855293.85045: no more pending results, returning what we have 30582 1726855293.85050: in VariableManager get_vars() 30582 1726855293.85092: Calling all_inventory to load vars for managed_node3 30582 1726855293.85094: Calling groups_inventory to load vars for managed_node3 30582 1726855293.85098: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.85110: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.85113: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.85116: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.85984: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.86853: done with get_vars() 30582 1726855293.86869: variable 'ansible_search_path' from source: unknown 30582 1726855293.86870: variable 'ansible_search_path' from source: unknown 30582 1726855293.86993: variable 'omit' from source: magic vars 30582 1726855293.87020: variable 'omit' from source: magic vars 30582 1726855293.87030: variable 'omit' from source: magic vars 30582 1726855293.87033: we have included files to process 30582 1726855293.87034: generating all_blocks data 30582 1726855293.87035: done generating all_blocks data 30582 1726855293.87036: processing included file: fedora.linux_system_roles.network 30582 1726855293.87049: in VariableManager get_vars() 30582 1726855293.87059: done with get_vars() 30582 1726855293.87082: in VariableManager get_vars() 30582 1726855293.87094: done with get_vars() 30582 1726855293.87121: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855293.87196: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855293.87258: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855293.87537: in VariableManager get_vars() 30582 1726855293.87550: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855293.88890: iterating over new_blocks loaded from include file 30582 1726855293.88892: in VariableManager get_vars() 30582 1726855293.88909: done with get_vars() 30582 1726855293.88910: filtering new block on tags 30582 1726855293.89192: done filtering new block on tags 30582 1726855293.89195: in VariableManager get_vars() 30582 1726855293.89210: done with get_vars() 30582 1726855293.89212: filtering new block on tags 30582 1726855293.89228: done 
filtering new block on tags 30582 1726855293.89229: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855293.89234: extending task lists for all hosts with included blocks 30582 1726855293.89395: done extending task lists 30582 1726855293.89397: done processing included files 30582 1726855293.89397: results queue empty 30582 1726855293.89398: checking for any_errors_fatal 30582 1726855293.89401: done checking for any_errors_fatal 30582 1726855293.89402: checking for max_fail_percentage 30582 1726855293.89403: done checking for max_fail_percentage 30582 1726855293.89404: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.89405: done checking to see if all hosts have failed 30582 1726855293.89406: getting the remaining hosts for this loop 30582 1726855293.89407: done getting the remaining hosts for this loop 30582 1726855293.89409: getting the next task for host managed_node3 30582 1726855293.89414: done getting next task for host managed_node3 30582 1726855293.89417: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855293.89420: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855293.89430: getting variables 30582 1726855293.89431: in VariableManager get_vars() 30582 1726855293.89444: Calling all_inventory to load vars for managed_node3 30582 1726855293.89447: Calling groups_inventory to load vars for managed_node3 30582 1726855293.89449: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.89455: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.89458: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.89462: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.90262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.91136: done with get_vars() 30582 1726855293.91156: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:01:33 -0400 (0:00:00.073) 0:00:30.262 ****** 30582 1726855293.91213: entering _queue_task() for managed_node3/include_tasks 30582 1726855293.91477: worker is 1 (out of 1 available) 30582 1726855293.91491: exiting _queue_task() for managed_node3/include_tasks 30582 1726855293.91501: done queuing things up, now waiting for results queue to drain 30582 1726855293.91503: waiting for pending results... 
30582 1726855293.91912: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855293.91922: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b33 30582 1726855293.91931: variable 'ansible_search_path' from source: unknown 30582 1726855293.91938: variable 'ansible_search_path' from source: unknown 30582 1726855293.91976: calling self._execute() 30582 1726855293.92063: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855293.92076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855293.92113: variable 'omit' from source: magic vars 30582 1726855293.92489: variable 'ansible_distribution_major_version' from source: facts 30582 1726855293.92506: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855293.92571: _execute() done 30582 1726855293.92575: dumping result to json 30582 1726855293.92578: done dumping result, returning 30582 1726855293.92581: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000000b33] 30582 1726855293.92583: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b33 30582 1726855293.92693: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b33 30582 1726855293.92696: WORKER PROCESS EXITING 30582 1726855293.92778: no more pending results, returning what we have 30582 1726855293.92784: in VariableManager get_vars() 30582 1726855293.92827: Calling all_inventory to load vars for managed_node3 30582 1726855293.92830: Calling groups_inventory to load vars for managed_node3 30582 1726855293.92832: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.92841: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.92844: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.92846: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855293.94263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.95643: done with get_vars() 30582 1726855293.95658: variable 'ansible_search_path' from source: unknown 30582 1726855293.95659: variable 'ansible_search_path' from source: unknown 30582 1726855293.95692: we have included files to process 30582 1726855293.95693: generating all_blocks data 30582 1726855293.95695: done generating all_blocks data 30582 1726855293.95697: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855293.95698: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855293.95699: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855293.96096: done processing included file 30582 1726855293.96097: iterating over new_blocks loaded from include file 30582 1726855293.96098: in VariableManager get_vars() 30582 1726855293.96114: done with get_vars() 30582 1726855293.96115: filtering new block on tags 30582 1726855293.96136: done filtering new block on tags 30582 1726855293.96137: in VariableManager get_vars() 30582 1726855293.96152: done with get_vars() 30582 1726855293.96153: filtering new block on tags 30582 1726855293.96183: done filtering new block on tags 30582 1726855293.96185: in VariableManager get_vars() 30582 1726855293.96201: done with get_vars() 30582 1726855293.96202: filtering new block on tags 30582 1726855293.96226: done filtering new block on tags 30582 1726855293.96227: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855293.96231: extending task lists for 
all hosts with included blocks 30582 1726855293.97524: done extending task lists 30582 1726855293.97526: done processing included files 30582 1726855293.97527: results queue empty 30582 1726855293.97527: checking for any_errors_fatal 30582 1726855293.97530: done checking for any_errors_fatal 30582 1726855293.97531: checking for max_fail_percentage 30582 1726855293.97532: done checking for max_fail_percentage 30582 1726855293.97533: checking to see if all hosts have failed and the running result is not ok 30582 1726855293.97534: done checking to see if all hosts have failed 30582 1726855293.97534: getting the remaining hosts for this loop 30582 1726855293.97536: done getting the remaining hosts for this loop 30582 1726855293.97538: getting the next task for host managed_node3 30582 1726855293.97543: done getting next task for host managed_node3 30582 1726855293.97546: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855293.97550: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855293.97560: getting variables 30582 1726855293.97561: in VariableManager get_vars() 30582 1726855293.97579: Calling all_inventory to load vars for managed_node3 30582 1726855293.97582: Calling groups_inventory to load vars for managed_node3 30582 1726855293.97584: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855293.97592: Calling all_plugins_play to load vars for managed_node3 30582 1726855293.97594: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855293.97597: Calling groups_plugins_play to load vars for managed_node3 30582 1726855293.98340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855293.99207: done with get_vars() 30582 1726855293.99225: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:01:33 -0400 (0:00:00.080) 0:00:30.342 ****** 30582 1726855293.99288: entering _queue_task() for managed_node3/setup 30582 1726855293.99557: worker is 1 (out of 1 available) 30582 1726855293.99571: exiting _queue_task() for managed_node3/setup 30582 1726855293.99585: done queuing things up, now waiting for results queue to drain 30582 1726855293.99588: waiting for pending results... 
30582 1726855293.99767: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855293.99890: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b90 30582 1726855293.99902: variable 'ansible_search_path' from source: unknown 30582 1726855293.99906: variable 'ansible_search_path' from source: unknown 30582 1726855293.99948: calling self._execute() 30582 1726855294.00193: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.00197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855294.00201: variable 'omit' from source: magic vars 30582 1726855294.00421: variable 'ansible_distribution_major_version' from source: facts 30582 1726855294.00431: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855294.00649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855294.02725: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855294.02791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855294.02827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855294.02860: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855294.02890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855294.02967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855294.03001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855294.03026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855294.03094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855294.03097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855294.03135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855294.03167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855294.03183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855294.03273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855294.03277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855294.03406: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855294.03415: variable 'ansible_facts' from source: unknown 30582 1726855294.04302: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855294.04307: when evaluation is False, skipping this task 30582 1726855294.04310: _execute() done 30582 1726855294.04312: dumping result to json 30582 1726855294.04315: done dumping result, returning 30582 1726855294.04365: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000000b90] 30582 1726855294.04369: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b90 30582 1726855294.04435: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b90 30582 1726855294.04438: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855294.04482: no more pending results, returning what we have 30582 1726855294.04486: results queue empty 30582 1726855294.04489: checking for any_errors_fatal 30582 1726855294.04490: done checking for any_errors_fatal 30582 1726855294.04491: checking for max_fail_percentage 30582 1726855294.04493: done checking for max_fail_percentage 30582 1726855294.04494: checking to see if all hosts have failed and the running result is not ok 30582 1726855294.04494: done checking to see if all hosts have failed 30582 1726855294.04495: getting the remaining hosts for this loop 30582 1726855294.04496: done getting the remaining hosts for this loop 30582 1726855294.04500: getting the next task for host managed_node3 30582 1726855294.04511: done getting next task for host managed_node3 30582 1726855294.04514: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855294.04520: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855294.04541: getting variables 30582 1726855294.04542: in VariableManager get_vars() 30582 1726855294.04579: Calling all_inventory to load vars for managed_node3 30582 1726855294.04582: Calling groups_inventory to load vars for managed_node3 30582 1726855294.04584: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855294.04597: Calling all_plugins_play to load vars for managed_node3 30582 1726855294.04600: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855294.04609: Calling groups_plugins_play to load vars for managed_node3 30582 1726855294.06164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855294.07853: done with get_vars() 30582 1726855294.07891: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:01:34 -0400 (0:00:00.087) 0:00:30.429 ****** 30582 1726855294.08005: entering _queue_task() for managed_node3/stat 30582 1726855294.08399: worker is 1 (out of 1 available) 30582 1726855294.08411: exiting _queue_task() for managed_node3/stat 30582 1726855294.08424: done queuing things up, now waiting for results queue to drain 30582 1726855294.08425: waiting for pending results... 
30582 1726855294.08868: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855294.08876: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b92 30582 1726855294.08899: variable 'ansible_search_path' from source: unknown 30582 1726855294.08906: variable 'ansible_search_path' from source: unknown 30582 1726855294.08947: calling self._execute() 30582 1726855294.09047: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.09058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855294.09079: variable 'omit' from source: magic vars 30582 1726855294.09503: variable 'ansible_distribution_major_version' from source: facts 30582 1726855294.09507: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855294.09639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855294.09915: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855294.09968: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855294.10010: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855294.10055: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855294.10142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855294.10178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855294.10211: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855294.10265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855294.10341: variable '__network_is_ostree' from source: set_fact 30582 1726855294.10354: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855294.10373: when evaluation is False, skipping this task 30582 1726855294.10376: _execute() done 30582 1726855294.10482: dumping result to json 30582 1726855294.10486: done dumping result, returning 30582 1726855294.10491: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000000b92] 30582 1726855294.10494: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b92 30582 1726855294.10565: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b92 30582 1726855294.10570: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855294.10624: no more pending results, returning what we have 30582 1726855294.10628: results queue empty 30582 1726855294.10629: checking for any_errors_fatal 30582 1726855294.10635: done checking for any_errors_fatal 30582 1726855294.10636: checking for max_fail_percentage 30582 1726855294.10638: done checking for max_fail_percentage 30582 1726855294.10639: checking to see if all hosts have failed and the running result is not ok 30582 1726855294.10640: done checking to see if all hosts have failed 30582 1726855294.10640: getting the remaining hosts for this loop 30582 1726855294.10642: done getting the remaining hosts for this loop 30582 
1726855294.10646: getting the next task for host managed_node3 30582 1726855294.10656: done getting next task for host managed_node3 30582 1726855294.10659: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855294.10666: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855294.10690: getting variables 30582 1726855294.10692: in VariableManager get_vars() 30582 1726855294.10728: Calling all_inventory to load vars for managed_node3 30582 1726855294.10730: Calling groups_inventory to load vars for managed_node3 30582 1726855294.10732: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855294.10743: Calling all_plugins_play to load vars for managed_node3 30582 1726855294.10745: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855294.10748: Calling groups_plugins_play to load vars for managed_node3 30582 1726855294.12172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855294.13810: done with get_vars() 30582 1726855294.13841: done getting variables 30582 1726855294.13903: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:01:34 -0400 (0:00:00.059) 0:00:30.489 ****** 30582 1726855294.13947: entering _queue_task() for managed_node3/set_fact 30582 1726855294.14521: worker is 1 (out of 1 available) 30582 1726855294.14531: exiting _queue_task() for managed_node3/set_fact 30582 1726855294.14540: done queuing things up, now waiting for results queue to drain 30582 1726855294.14542: waiting for pending results... 
30582 1726855294.14673: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855294.14805: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b93 30582 1726855294.14826: variable 'ansible_search_path' from source: unknown 30582 1726855294.14835: variable 'ansible_search_path' from source: unknown 30582 1726855294.14881: calling self._execute() 30582 1726855294.14973: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.15095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855294.15098: variable 'omit' from source: magic vars 30582 1726855294.15372: variable 'ansible_distribution_major_version' from source: facts 30582 1726855294.15392: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855294.15568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855294.15857: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855294.15905: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855294.15944: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855294.15991: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855294.16090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855294.16123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855294.16156: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855294.16198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855294.16301: variable '__network_is_ostree' from source: set_fact 30582 1726855294.16314: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855294.16323: when evaluation is False, skipping this task 30582 1726855294.16331: _execute() done 30582 1726855294.16339: dumping result to json 30582 1726855294.16391: done dumping result, returning 30582 1726855294.16403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000000b93] 30582 1726855294.16405: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b93 30582 1726855294.16473: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b93 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855294.16551: no more pending results, returning what we have 30582 1726855294.16555: results queue empty 30582 1726855294.16556: checking for any_errors_fatal 30582 1726855294.16562: done checking for any_errors_fatal 30582 1726855294.16562: checking for max_fail_percentage 30582 1726855294.16565: done checking for max_fail_percentage 30582 1726855294.16566: checking to see if all hosts have failed and the running result is not ok 30582 1726855294.16566: done checking to see if all hosts have failed 30582 1726855294.16567: getting the remaining hosts for this loop 30582 1726855294.16569: done getting the remaining hosts for this loop 30582 1726855294.16573: getting the next task for 
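The skip recorded above comes from a `when:` guard on the `set_fact` task: the log reports `"false_condition": "not __network_is_ostree is defined"`, so the task is bypassed once the fact already exists. A hedged sketch of what such a guarded task looks like (the actual `tasks/set_facts.yml` is not reproduced in this log; `ostree_booted_stat` is a hypothetical registered variable used only for illustration):

```yaml
# Hypothetical reconstruction based on the conditional reported in the log;
# not the verbatim contents of set_facts.yml:17.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ ostree_booted_stat.stat.exists | d(false) }}"
  when: not __network_is_ostree is defined
```

Because `set_fact` results persist for the host, the guard makes the task idempotent across repeated role invocations, which is exactly why this run reports `skip_reason: Conditional result was False`.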
host managed_node3 30582 1726855294.16585: done getting next task for host managed_node3 30582 1726855294.16590: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855294.16597: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855294.16619: getting variables 30582 1726855294.16621: in VariableManager get_vars() 30582 1726855294.16659: Calling all_inventory to load vars for managed_node3 30582 1726855294.16664: Calling groups_inventory to load vars for managed_node3 30582 1726855294.16666: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855294.16678: Calling all_plugins_play to load vars for managed_node3 30582 1726855294.16682: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855294.16685: Calling groups_plugins_play to load vars for managed_node3 30582 1726855294.17419: WORKER PROCESS EXITING 30582 1726855294.18459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855294.19337: done with get_vars() 30582 1726855294.19354: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:01:34 -0400 (0:00:00.054) 0:00:30.544 ****** 30582 1726855294.19431: entering _queue_task() for managed_node3/service_facts 30582 1726855294.19693: worker is 1 (out of 1 available) 30582 1726855294.19707: exiting _queue_task() for managed_node3/service_facts 30582 1726855294.19719: done queuing things up, now waiting for results queue to drain 30582 1726855294.19721: waiting for pending results... 
30582 1726855294.19911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855294.20012: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b95 30582 1726855294.20023: variable 'ansible_search_path' from source: unknown 30582 1726855294.20028: variable 'ansible_search_path' from source: unknown 30582 1726855294.20061: calling self._execute() 30582 1726855294.20154: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.20158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855294.20169: variable 'omit' from source: magic vars 30582 1726855294.20515: variable 'ansible_distribution_major_version' from source: facts 30582 1726855294.20594: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855294.20597: variable 'omit' from source: magic vars 30582 1726855294.20603: variable 'omit' from source: magic vars 30582 1726855294.20638: variable 'omit' from source: magic vars 30582 1726855294.20679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855294.20712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855294.20730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855294.20747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855294.20759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855294.20862: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855294.20866: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.20868: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855294.20905: Set connection var ansible_timeout to 10 30582 1726855294.20908: Set connection var ansible_connection to ssh 30582 1726855294.20914: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855294.20919: Set connection var ansible_pipelining to False 30582 1726855294.20924: Set connection var ansible_shell_executable to /bin/sh 30582 1726855294.20927: Set connection var ansible_shell_type to sh 30582 1726855294.20950: variable 'ansible_shell_executable' from source: unknown 30582 1726855294.20953: variable 'ansible_connection' from source: unknown 30582 1726855294.20956: variable 'ansible_module_compression' from source: unknown 30582 1726855294.20958: variable 'ansible_shell_type' from source: unknown 30582 1726855294.20961: variable 'ansible_shell_executable' from source: unknown 30582 1726855294.20963: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855294.20965: variable 'ansible_pipelining' from source: unknown 30582 1726855294.20967: variable 'ansible_timeout' from source: unknown 30582 1726855294.20969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855294.21162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855294.21171: variable 'omit' from source: magic vars 30582 1726855294.21177: starting attempt loop 30582 1726855294.21179: running the handler 30582 1726855294.21192: _low_level_execute_command(): starting 30582 1726855294.21209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855294.21916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855294.21919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.22000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855294.22005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.22033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855294.22116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855294.23809: stdout chunk (state=3): >>>/root <<< 30582 1726855294.23919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855294.23958: stderr chunk (state=3): >>><<< 30582 1726855294.23962: stdout chunk (state=3): >>><<< 30582 1726855294.23975: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855294.23991: _low_level_execute_command(): starting 30582 1726855294.24027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041 `" && echo ansible-tmp-1726855294.2397807-32039-80543307485041="` echo /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041 `" ) && sleep 0' 30582 1726855294.24716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855294.24792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855294.24853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855294.26754: stdout chunk (state=3): >>>ansible-tmp-1726855294.2397807-32039-80543307485041=/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041 <<< 30582 1726855294.26861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855294.26897: stderr chunk (state=3): >>><<< 30582 1726855294.26902: stdout chunk (state=3): >>><<< 30582 1726855294.26915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855294.2397807-32039-80543307485041=/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855294.26956: variable 'ansible_module_compression' from source: unknown 30582 1726855294.26994: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855294.27028: variable 'ansible_facts' from source: unknown 30582 1726855294.27085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py 30582 1726855294.27192: Sending initial data 30582 1726855294.27196: Sent initial data (161 bytes) 30582 1726855294.27647: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855294.27651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.27653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855294.27655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855294.27657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.27712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855294.27715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855294.27718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855294.27780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855294.29366: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855294.29371: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855294.29421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855294.29505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpu23ejtdq /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py <<< 30582 1726855294.29509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py" <<< 30582 1726855294.29554: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpu23ejtdq" to remote "/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py" <<< 30582 1726855294.30404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855294.30417: stderr chunk (state=3): >>><<< 30582 1726855294.30420: stdout chunk (state=3): >>><<< 30582 1726855294.30452: done transferring module to remote 30582 1726855294.30459: _low_level_execute_command(): starting 30582 1726855294.30461: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/ /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py && sleep 0' 30582 1726855294.30909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855294.30913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855294.30915: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855294.30917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855294.30919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.30967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855294.30970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855294.31039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855294.32769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855294.32865: stderr chunk (state=3): >>><<< 30582 1726855294.32868: stdout chunk (state=3): >>><<< 30582 1726855294.32871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855294.32873: _low_level_execute_command(): starting 30582 1726855294.32875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/AnsiballZ_service_facts.py && sleep 0' 30582 1726855294.33225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855294.33246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855294.33292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855294.33305: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855294.33374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855295.85492: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 30582 1726855295.85523: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855295.87013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855295.87041: stderr chunk (state=3): >>><<< 30582 1726855295.87044: stdout chunk (state=3): >>><<< 30582 1726855295.87077: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855295.87547: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855295.87551: _low_level_execute_command(): starting 30582 1726855295.87562: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855294.2397807-32039-80543307485041/ > /dev/null 2>&1 && sleep 0' 30582 1726855295.88219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855295.88292: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855295.88298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855295.88300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855295.88350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855295.90185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855295.90207: stderr chunk (state=3): >>><<< 30582 1726855295.90210: stdout chunk (state=3): >>><<< 30582 1726855295.90223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855295.90229: handler run complete 30582 1726855295.90350: variable 'ansible_facts' from source: unknown 30582 1726855295.90450: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855295.90725: variable 'ansible_facts' from source: unknown 30582 1726855295.90805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855295.90918: attempt loop complete, returning result 30582 1726855295.90922: _execute() done 30582 1726855295.90924: dumping result to json 30582 1726855295.90990: done dumping result, returning 30582 1726855295.90994: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000000b95] 30582 1726855295.90996: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b95 30582 1726855295.92082: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b95 30582 1726855295.92086: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855295.92155: no more pending results, returning what we have 30582 1726855295.92157: results queue empty 30582 1726855295.92158: checking for any_errors_fatal 30582 1726855295.92161: done checking for any_errors_fatal 30582 1726855295.92161: checking for max_fail_percentage 30582 1726855295.92162: done checking for max_fail_percentage 30582 1726855295.92163: checking to see if all hosts have failed and the running result is not ok 30582 1726855295.92163: done checking to see if all hosts have failed 30582 1726855295.92164: getting the remaining hosts for this loop 30582 1726855295.92165: done getting the remaining hosts for this loop 30582 1726855295.92167: getting the next task for host managed_node3 30582 1726855295.92172: done getting next task for host managed_node3 30582 1726855295.92175: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 
1726855295.92179: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855295.92188: getting variables 30582 1726855295.92189: in VariableManager get_vars() 30582 1726855295.92211: Calling all_inventory to load vars for managed_node3 30582 1726855295.92213: Calling groups_inventory to load vars for managed_node3 30582 1726855295.92214: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855295.92221: Calling all_plugins_play to load vars for managed_node3 30582 1726855295.92224: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855295.92226: Calling groups_plugins_play to load vars for managed_node3 30582 1726855295.93047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855295.93931: done with get_vars() 30582 1726855295.93950: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:01:35 -0400 (0:00:01.745) 0:00:32.290 ****** 30582 1726855295.94028: entering _queue_task() for managed_node3/package_facts 30582 1726855295.94293: worker is 1 (out of 1 available) 30582 1726855295.94308: exiting _queue_task() for managed_node3/package_facts 30582 1726855295.94321: done queuing things up, now waiting for results queue to drain 30582 1726855295.94323: waiting for pending results... 
30582 1726855295.94526: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855295.94795: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b96 30582 1726855295.94799: variable 'ansible_search_path' from source: unknown 30582 1726855295.94803: variable 'ansible_search_path' from source: unknown 30582 1726855295.94806: calling self._execute() 30582 1726855295.94811: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855295.94819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855295.94828: variable 'omit' from source: magic vars 30582 1726855295.95214: variable 'ansible_distribution_major_version' from source: facts 30582 1726855295.95226: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855295.95232: variable 'omit' from source: magic vars 30582 1726855295.95318: variable 'omit' from source: magic vars 30582 1726855295.95360: variable 'omit' from source: magic vars 30582 1726855295.95396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855295.95434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855295.95454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855295.95477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855295.95581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855295.95585: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855295.95589: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855295.95592: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855295.95634: Set connection var ansible_timeout to 10 30582 1726855295.95638: Set connection var ansible_connection to ssh 30582 1726855295.95643: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855295.95648: Set connection var ansible_pipelining to False 30582 1726855295.95689: Set connection var ansible_shell_executable to /bin/sh 30582 1726855295.95693: Set connection var ansible_shell_type to sh 30582 1726855295.95695: variable 'ansible_shell_executable' from source: unknown 30582 1726855295.95697: variable 'ansible_connection' from source: unknown 30582 1726855295.95705: variable 'ansible_module_compression' from source: unknown 30582 1726855295.95710: variable 'ansible_shell_type' from source: unknown 30582 1726855295.95713: variable 'ansible_shell_executable' from source: unknown 30582 1726855295.95715: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855295.95717: variable 'ansible_pipelining' from source: unknown 30582 1726855295.95719: variable 'ansible_timeout' from source: unknown 30582 1726855295.95720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855295.95904: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855295.95912: variable 'omit' from source: magic vars 30582 1726855295.95922: starting attempt loop 30582 1726855295.95928: running the handler 30582 1726855295.96034: _low_level_execute_command(): starting 30582 1726855295.96038: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855295.96608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855295.96619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855295.96631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855295.96646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855295.96659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855295.96696: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855295.96703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855295.96706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855295.96708: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855295.96711: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855295.96713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855295.96796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855295.96800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855295.96803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855295.96810: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855295.96813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855295.96820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855295.96833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855295.96857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855295.96949: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30582 1726855295.98598: stdout chunk (state=3): >>>/root <<< 30582 1726855295.98698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855295.98727: stderr chunk (state=3): >>><<< 30582 1726855295.98731: stdout chunk (state=3): >>><<< 30582 1726855295.98751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855295.98764: _low_level_execute_command(): starting 30582 1726855295.98776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589 `" && echo ansible-tmp-1726855295.9875062-32119-112798890272589="` echo /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589 `" ) && sleep 0' 30582 
1726855295.99233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855295.99236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855295.99239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855295.99241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855295.99244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855295.99281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855295.99285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855295.99359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855296.01266: stdout chunk (state=3): >>>ansible-tmp-1726855295.9875062-32119-112798890272589=/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589 <<< 30582 1726855296.01367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855296.01405: stderr chunk (state=3): >>><<< 30582 1726855296.01408: stdout chunk (state=3): >>><<< 30582 1726855296.01423: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726855295.9875062-32119-112798890272589=/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855296.01464: variable 'ansible_module_compression' from source: unknown 30582 1726855296.01512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855296.01561: variable 'ansible_facts' from source: unknown 30582 1726855296.01682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py 30582 1726855296.01792: Sending initial data 30582 1726855296.01796: Sent initial data (162 bytes) 30582 1726855296.02242: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855296.02245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855296.02248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855296.02250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855296.02253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855296.02309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855296.02312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855296.02315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855296.02379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855296.03963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855296.04021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855296.04086: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpp70evcf_ /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py <<< 30582 1726855296.04092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py" <<< 30582 1726855296.04139: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpp70evcf_" to remote "/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py" <<< 30582 1726855296.05715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855296.05877: stderr chunk (state=3): >>><<< 30582 1726855296.05881: stdout chunk (state=3): >>><<< 30582 1726855296.05884: done transferring module to remote 30582 1726855296.05886: _low_level_execute_command(): starting 30582 1726855296.05895: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/ /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py && sleep 0' 30582 1726855296.06496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855296.06512: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855296.06637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855296.06665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855296.06772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855296.08588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855296.08686: stderr chunk (state=3): >>><<< 30582 1726855296.08705: stdout chunk (state=3): >>><<< 30582 1726855296.08800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855296.08804: _low_level_execute_command(): starting 30582 1726855296.08807: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/AnsiballZ_package_facts.py && sleep 0' 30582 1726855296.09378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855296.09399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855296.09416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855296.09443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855296.09543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855296.09571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855296.09668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855296.53698: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": 
"redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30582 1726855296.53717: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", 
"version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30582 1726855296.53782: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30582 1726855296.53805: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30582 1726855296.53817: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30582 1726855296.53822: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855296.53827: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", 
"release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30582 1726855296.53862: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": 
"rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855296.55732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
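The JSON stream above is the result of Ansible's `package_facts` module, closed by the `invocation` block and the SSH stderr "connection closed" message. As a minimal sketch of how that structure can be consumed downstream (the `git` record is copied from the log; the parsing code itself is illustrative and not part of this run):

```python
import json

# Assumed shape, matching the log: ansible_facts.packages maps each package
# name to a LIST of records, since several versions/arches can coexist.
facts_json = '''{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
           "epoch": null, "arch": "x86_64", "source": "rpm"}]}}}'''

packages = json.loads(facts_json)["ansible_facts"]["packages"]

# Look up one installed package and read its fields.
git = packages["git"][0]
assert git["version"] == "2.45.2"
assert git["arch"] == "x86_64"
```

In a playbook the same data is reached as `ansible_facts.packages['git'][0].version` after a `package_facts` task has run.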
<<< 30582 1726855296.55738: stdout chunk (state=3): >>><<< 30582 1726855296.55741: stderr chunk (state=3): >>><<< 30582 1726855296.55818: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
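The JSON payload above is the return value of the `package_facts` module: `ansible_facts.packages` maps each package name to a list of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys (`epoch` is `null` when the RPM has none). A minimal sketch of consuming that structure — the sample entries are copied from the log; the `evr` helper is an illustrative name, not part of Ansible:

```python
# Sketch: summarize a package_facts-style payload.
# Sample data copied from the log above; keys follow the package_facts
# return schema (name/version/release/epoch/arch/source).
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def evr(pkg):
    """Render an RPM epoch:version-release string; epoch may be None."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") is not None else ""
    return f"{epoch}{pkg['version']}-{pkg['release']}"

# Each name maps to a *list* because multiple arches/versions can coexist.
summary = {name: [evr(p) for p in entries] for name, entries in packages.items()}
print(summary["openssl"])  # ['1:3.2.2-12.el10']
```

In a playbook these facts would be reached the same way, e.g. `ansible_facts.packages['openssl'][0].version`.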
30582 1726855296.57133: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855296.57149: _low_level_execute_command(): starting 30582 1726855296.57152: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855295.9875062-32119-112798890272589/ > /dev/null 2>&1 && sleep 0' 30582 1726855296.57608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855296.57611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855296.57614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855296.57618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855296.57669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855296.57675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855296.57680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855296.57743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855296.60096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855296.60102: stdout chunk (state=3): >>><<< 30582 1726855296.60104: stderr chunk (state=3): >>><<< 30582 1726855296.60107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30582 1726855296.60109: handler run complete 30582 1726855296.61764: variable 'ansible_facts' from source: unknown 30582 1726855296.62651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855296.67133: variable 'ansible_facts' from source: unknown 30582 1726855296.68217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855296.70052: attempt loop complete, returning result 30582 1726855296.70114: _execute() done 30582 1726855296.70122: dumping result to json 30582 1726855296.70693: done dumping result, returning 30582 1726855296.70697: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000000b96] 30582 1726855296.70700: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b96 30582 1726855296.74950: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b96 30582 1726855296.74954: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855296.75112: no more pending results, returning what we have 30582 1726855296.75115: results queue empty 30582 1726855296.75116: checking for any_errors_fatal 30582 1726855296.75121: done checking for any_errors_fatal 30582 1726855296.75122: checking for max_fail_percentage 30582 1726855296.75124: done checking for max_fail_percentage 30582 1726855296.75125: checking to see if all hosts have failed and the running result is not ok 30582 1726855296.75125: done checking to see if all hosts have failed 30582 1726855296.75126: getting the remaining hosts for this loop 30582 1726855296.75127: done getting the remaining hosts for this loop 30582 1726855296.75131: getting the next task for host managed_node3 30582 1726855296.75138: done 
getting next task for host managed_node3 30582 1726855296.75142: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855296.75147: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855296.75159: getting variables 30582 1726855296.75160: in VariableManager get_vars() 30582 1726855296.75298: Calling all_inventory to load vars for managed_node3 30582 1726855296.75302: Calling groups_inventory to load vars for managed_node3 30582 1726855296.75304: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855296.75314: Calling all_plugins_play to load vars for managed_node3 30582 1726855296.75317: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855296.75319: Calling groups_plugins_play to load vars for managed_node3 30582 1726855296.77845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855296.81317: done with get_vars() 30582 1726855296.81351: done getting variables 30582 1726855296.81425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:01:36 -0400 (0:00:00.874) 0:00:33.164 ****** 30582 1726855296.81463: entering _queue_task() for managed_node3/debug 30582 1726855296.82239: worker is 1 (out of 1 available) 30582 1726855296.82254: exiting _queue_task() for managed_node3/debug 30582 1726855296.82267: done queuing things up, now waiting for results queue to drain 30582 1726855296.82269: waiting for pending results... 
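Note how the "Check which packages are installed" task above was executed with `'_ansible_no_log': True`, so the callback printed only the `"censored"` placeholder instead of the full package dump. A hedged sketch of that censoring behavior — this is an approximation for illustration, not Ansible's actual implementation:

```python
# Sketch (assumption: simplified model of no_log censoring, not the real
# ansible-core code path): when a task sets no_log, the result shown to
# callbacks is replaced by a placeholder that keeps only 'changed',
# mirroring the "censored" output in the log above.
CENSOR_MSG = ("the output has been hidden due to the fact that "
              "'no_log: true' was specified for this result")

def censor_result(result, no_log):
    """Return the result as a callback would display it."""
    if not no_log:
        return result
    return {"censored": CENSOR_MSG, "changed": result.get("changed", False)}

full = {"ansible_facts": {"packages": {"git": []}}, "changed": False}
shown = censor_result(full, no_log=True)
print(sorted(shown))  # ['censored', 'changed']
```

The uncensored facts still land in the fact cache (hence the later `variable 'ansible_facts' from source: unknown` lookups); only the displayed result is redacted.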
30582 1726855296.82908: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855296.83047: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b34 30582 1726855296.83246: variable 'ansible_search_path' from source: unknown 30582 1726855296.83250: variable 'ansible_search_path' from source: unknown 30582 1726855296.83253: calling self._execute() 30582 1726855296.83402: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855296.83412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855296.83426: variable 'omit' from source: magic vars 30582 1726855296.84221: variable 'ansible_distribution_major_version' from source: facts 30582 1726855296.84240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855296.84251: variable 'omit' from source: magic vars 30582 1726855296.84388: variable 'omit' from source: magic vars 30582 1726855296.84604: variable 'network_provider' from source: set_fact 30582 1726855296.84629: variable 'omit' from source: magic vars 30582 1726855296.84872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855296.84876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855296.84898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855296.85003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855296.85020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855296.85056: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855296.85067: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855296.85097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855296.85314: Set connection var ansible_timeout to 10 30582 1726855296.85322: Set connection var ansible_connection to ssh 30582 1726855296.85333: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855296.85400: Set connection var ansible_pipelining to False 30582 1726855296.85414: Set connection var ansible_shell_executable to /bin/sh 30582 1726855296.85421: Set connection var ansible_shell_type to sh 30582 1726855296.85448: variable 'ansible_shell_executable' from source: unknown 30582 1726855296.85455: variable 'ansible_connection' from source: unknown 30582 1726855296.85462: variable 'ansible_module_compression' from source: unknown 30582 1726855296.85468: variable 'ansible_shell_type' from source: unknown 30582 1726855296.85629: variable 'ansible_shell_executable' from source: unknown 30582 1726855296.85631: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855296.85634: variable 'ansible_pipelining' from source: unknown 30582 1726855296.85635: variable 'ansible_timeout' from source: unknown 30582 1726855296.85637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855296.85792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855296.85862: variable 'omit' from source: magic vars 30582 1726855296.85873: starting attempt loop 30582 1726855296.85879: running the handler 30582 1726855296.86003: handler run complete 30582 1726855296.86023: attempt loop complete, returning result 30582 1726855296.86030: _execute() done 30582 1726855296.86036: dumping result to json 30582 1726855296.86070: done dumping result, returning 
30582 1726855296.86083: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000000b34] 30582 1726855296.86094: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b34 30582 1726855296.86351: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b34 30582 1726855296.86355: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855296.86455: no more pending results, returning what we have 30582 1726855296.86459: results queue empty 30582 1726855296.86461: checking for any_errors_fatal 30582 1726855296.86471: done checking for any_errors_fatal 30582 1726855296.86472: checking for max_fail_percentage 30582 1726855296.86478: done checking for max_fail_percentage 30582 1726855296.86479: checking to see if all hosts have failed and the running result is not ok 30582 1726855296.86479: done checking to see if all hosts have failed 30582 1726855296.86480: getting the remaining hosts for this loop 30582 1726855296.86482: done getting the remaining hosts for this loop 30582 1726855296.86486: getting the next task for host managed_node3 30582 1726855296.86598: done getting next task for host managed_node3 30582 1726855296.86603: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855296.86607: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855296.86622: getting variables 30582 1726855296.86623: in VariableManager get_vars() 30582 1726855296.86660: Calling all_inventory to load vars for managed_node3 30582 1726855296.86663: Calling groups_inventory to load vars for managed_node3 30582 1726855296.86665: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855296.86677: Calling all_plugins_play to load vars for managed_node3 30582 1726855296.86680: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855296.86682: Calling groups_plugins_play to load vars for managed_node3 30582 1726855296.89321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855296.91632: done with get_vars() 30582 1726855296.91663: done getting variables 30582 1726855296.91729: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:01:36 -0400 (0:00:00.103) 0:00:33.267 ****** 30582 1726855296.91771: entering _queue_task() for managed_node3/fail 30582 1726855296.92137: worker is 1 (out of 1 available) 30582 1726855296.92150: exiting _queue_task() for managed_node3/fail 30582 1726855296.92165: done queuing things up, now waiting for results queue to drain 30582 1726855296.92167: waiting for pending results... 30582 1726855296.92478: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855296.92648: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b35 30582 1726855296.92794: variable 'ansible_search_path' from source: unknown 30582 1726855296.92800: variable 'ansible_search_path' from source: unknown 30582 1726855296.92805: calling self._execute() 30582 1726855296.92892: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855296.92937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855296.92953: variable 'omit' from source: magic vars 30582 1726855296.93812: variable 'ansible_distribution_major_version' from source: facts 30582 1726855296.93817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855296.94027: variable 'network_state' from source: role '' defaults 30582 1726855296.94045: Evaluated conditional (network_state != {}): False 30582 1726855296.94054: when evaluation is False, skipping this task 30582 1726855296.94064: _execute() done 30582 1726855296.94073: dumping result to json 30582 1726855296.94080: done dumping result, returning 30582 1726855296.94148: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000000b35] 30582 1726855296.94152: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b35 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855296.94317: no more pending results, returning what we have 30582 1726855296.94320: results queue empty 30582 1726855296.94321: checking for any_errors_fatal 30582 1726855296.94327: done checking for any_errors_fatal 30582 1726855296.94327: checking for max_fail_percentage 30582 1726855296.94329: done checking for max_fail_percentage 30582 1726855296.94330: checking to see if all hosts have failed and the running result is not ok 30582 1726855296.94331: done checking to see if all hosts have failed 30582 1726855296.94332: getting the remaining hosts for this loop 30582 1726855296.94333: done getting the remaining hosts for this loop 30582 1726855296.94336: getting the next task for host managed_node3 30582 1726855296.94345: done getting next task for host managed_node3 30582 1726855296.94348: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855296.94353: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855296.94378: getting variables 30582 1726855296.94379: in VariableManager get_vars() 30582 1726855296.94420: Calling all_inventory to load vars for managed_node3 30582 1726855296.94423: Calling groups_inventory to load vars for managed_node3 30582 1726855296.94425: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855296.94439: Calling all_plugins_play to load vars for managed_node3 30582 1726855296.94442: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855296.94445: Calling groups_plugins_play to load vars for managed_node3 30582 1726855296.95607: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b35 30582 1726855296.95611: WORKER PROCESS EXITING 30582 1726855297.01978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.03567: done with get_vars() 30582 1726855297.03600: done getting variables 30582 1726855297.03655: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:01:37 -0400 (0:00:00.119) 0:00:33.386 ****** 30582 1726855297.03689: entering _queue_task() for managed_node3/fail 30582 1726855297.04203: worker is 1 (out of 1 available) 30582 1726855297.04214: exiting _queue_task() for managed_node3/fail 30582 1726855297.04225: done queuing things up, now waiting for results queue to drain 30582 1726855297.04227: waiting for pending results... 30582 1726855297.04402: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855297.04579: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b36 30582 1726855297.04603: variable 'ansible_search_path' from source: unknown 30582 1726855297.04673: variable 'ansible_search_path' from source: unknown 30582 1726855297.04677: calling self._execute() 30582 1726855297.04756: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.04769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.04793: variable 'omit' from source: magic vars 30582 1726855297.05180: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.05200: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.05339: variable 'network_state' from source: role '' defaults 30582 1726855297.05356: Evaluated conditional (network_state != {}): False 30582 1726855297.05364: when evaluation is False, skipping this task 30582 1726855297.05429: _execute() done 30582 1726855297.05432: dumping result to json 30582 1726855297.05435: done dumping result, returning 30582 1726855297.05437: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000000b36] 30582 1726855297.05441: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b36 30582 1726855297.05529: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b36 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855297.05581: no more pending results, returning what we have 30582 1726855297.05586: results queue empty 30582 1726855297.05589: checking for any_errors_fatal 30582 1726855297.05600: done checking for any_errors_fatal 30582 1726855297.05601: checking for max_fail_percentage 30582 1726855297.05603: done checking for max_fail_percentage 30582 1726855297.05604: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.05605: done checking to see if all hosts have failed 30582 1726855297.05605: getting the remaining hosts for this loop 30582 1726855297.05607: done getting the remaining hosts for this loop 30582 1726855297.05611: getting the next task for host managed_node3 30582 1726855297.05620: done getting next task for host managed_node3 30582 1726855297.05625: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855297.05631: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.05657: getting variables 30582 1726855297.05659: in VariableManager get_vars() 30582 1726855297.06003: Calling all_inventory to load vars for managed_node3 30582 1726855297.06006: Calling groups_inventory to load vars for managed_node3 30582 1726855297.06008: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.06018: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.06021: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.06023: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.06702: WORKER PROCESS EXITING 30582 1726855297.07371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.09116: done with get_vars() 30582 1726855297.09140: done getting variables 30582 1726855297.09204: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:01:37 -0400 (0:00:00.055) 0:00:33.442 ****** 30582 1726855297.09240: entering _queue_task() for managed_node3/fail 30582 1726855297.09611: worker is 1 (out of 1 available) 30582 1726855297.09625: exiting _queue_task() for managed_node3/fail 30582 1726855297.09637: done queuing things up, now waiting for results queue to drain 30582 1726855297.09639: waiting for pending results... 30582 1726855297.09938: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855297.10095: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b37 30582 1726855297.10121: variable 'ansible_search_path' from source: unknown 30582 1726855297.10129: variable 'ansible_search_path' from source: unknown 30582 1726855297.10169: calling self._execute() 30582 1726855297.10272: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.10286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.10303: variable 'omit' from source: magic vars 30582 1726855297.10682: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.10702: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.10882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.13161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.13250: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.13293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.13336: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.13366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.13452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.13486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.13518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.13565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.13584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.13692: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.13714: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855297.13839: variable 'ansible_distribution' from source: facts 30582 1726855297.13848: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.13871: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855297.14124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30582 1726855297.14151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.14178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.14224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.14243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.14294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.14328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.14356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.14401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.14424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.14471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.14506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.14540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.14628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.14631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.14916: variable 'network_connections' from source: include params 30582 1726855297.14931: variable 'interface' from source: play vars 30582 1726855297.15003: variable 'interface' from source: play vars 30582 1726855297.15020: variable 'network_state' from source: role '' defaults 30582 1726855297.15094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.15286: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.15330: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.15390: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.15399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.15444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.15471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.15601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.15604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.15607: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855297.15609: when evaluation is False, skipping this task 30582 1726855297.15612: _execute() done 30582 1726855297.15614: dumping result to json 30582 1726855297.15616: done dumping result, returning 30582 1726855297.15618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000000b37] 30582 1726855297.15620: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b37 30582 1726855297.15805: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b37 
30582 1726855297.15808: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855297.15856: no more pending results, returning what we have 30582 1726855297.15860: results queue empty 30582 1726855297.15863: checking for any_errors_fatal 30582 1726855297.15870: done checking for any_errors_fatal 30582 1726855297.15871: checking for max_fail_percentage 30582 1726855297.15874: done checking for max_fail_percentage 30582 1726855297.15875: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.15875: done checking to see if all hosts have failed 30582 1726855297.15876: getting the remaining hosts for this loop 30582 1726855297.15878: done getting the remaining hosts for this loop 30582 1726855297.15883: getting the next task for host managed_node3 30582 1726855297.15896: done getting next task for host managed_node3 30582 1726855297.15901: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855297.15906: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.15927: getting variables 30582 1726855297.15929: in VariableManager get_vars() 30582 1726855297.15967: Calling all_inventory to load vars for managed_node3 30582 1726855297.15970: Calling groups_inventory to load vars for managed_node3 30582 1726855297.15973: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.15984: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.16189: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.16195: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.17517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.19052: done with get_vars() 30582 1726855297.19080: done getting variables 30582 1726855297.19142: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:01:37 -0400 (0:00:00.099) 0:00:33.541 ****** 30582 1726855297.19178: entering _queue_task() for managed_node3/dnf 30582 1726855297.19531: worker is 1 (out of 1 available) 30582 1726855297.19544: exiting _queue_task() for managed_node3/dnf 30582 1726855297.19556: done queuing things up, now waiting for results queue to drain 30582 1726855297.19558: waiting for pending results... 30582 1726855297.20007: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855297.20013: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b38 30582 1726855297.20024: variable 'ansible_search_path' from source: unknown 30582 1726855297.20032: variable 'ansible_search_path' from source: unknown 30582 1726855297.20070: calling self._execute() 30582 1726855297.20168: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.20181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.20199: variable 'omit' from source: magic vars 30582 1726855297.20589: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.20607: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.20810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.23038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.23128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.23173: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855297.23268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.23271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.23324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.23358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.23397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.23440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.23459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.23590: variable 'ansible_distribution' from source: facts 30582 1726855297.23604: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.23625: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855297.23814: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.23891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855297.23924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.23955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.24001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.24021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.24071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.24102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.24135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.24182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.24203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.24251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.24292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.24307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.24349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.24469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.24539: variable 'network_connections' from source: include params 30582 1726855297.24558: variable 'interface' from source: play vars 30582 1726855297.24632: variable 'interface' from source: play vars 30582 1726855297.24715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.25232: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.25276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.25313: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.25350: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.25400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.25428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.25471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.25504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.25571: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855297.25829: variable 'network_connections' from source: include params 30582 1726855297.25881: variable 'interface' from source: play vars 30582 1726855297.25911: variable 'interface' from source: play vars 30582 1726855297.25951: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855297.25959: when evaluation is False, skipping this task 30582 1726855297.25967: _execute() done 30582 1726855297.25974: dumping result to json 30582 1726855297.25982: done dumping result, returning 30582 1726855297.26000: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000b38] 30582 1726855297.26092: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b38 30582 1726855297.26168: 
done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b38 30582 1726855297.26171: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855297.26225: no more pending results, returning what we have 30582 1726855297.26229: results queue empty 30582 1726855297.26230: checking for any_errors_fatal 30582 1726855297.26238: done checking for any_errors_fatal 30582 1726855297.26238: checking for max_fail_percentage 30582 1726855297.26240: done checking for max_fail_percentage 30582 1726855297.26241: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.26242: done checking to see if all hosts have failed 30582 1726855297.26243: getting the remaining hosts for this loop 30582 1726855297.26244: done getting the remaining hosts for this loop 30582 1726855297.26249: getting the next task for host managed_node3 30582 1726855297.26258: done getting next task for host managed_node3 30582 1726855297.26263: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855297.26268: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.26291: getting variables 30582 1726855297.26294: in VariableManager get_vars() 30582 1726855297.26332: Calling all_inventory to load vars for managed_node3 30582 1726855297.26335: Calling groups_inventory to load vars for managed_node3 30582 1726855297.26337: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.26349: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.26352: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.26355: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.28169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.29710: done with get_vars() 30582 1726855297.29740: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855297.29825: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:01:37 -0400 (0:00:00.106) 0:00:33.648 ****** 30582 1726855297.29860: entering _queue_task() for managed_node3/yum 30582 1726855297.30231: worker is 1 (out of 1 available) 30582 1726855297.30246: exiting _queue_task() for managed_node3/yum 30582 1726855297.30259: done queuing things up, now waiting for results queue to drain 30582 1726855297.30262: waiting for pending results... 30582 1726855297.30615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855297.30711: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b39 30582 1726855297.30715: variable 'ansible_search_path' from source: unknown 30582 1726855297.30893: variable 'ansible_search_path' from source: unknown 30582 1726855297.30896: calling self._execute() 30582 1726855297.30899: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.30902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.30905: variable 'omit' from source: magic vars 30582 1726855297.31257: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.31275: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.31456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.33691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.33778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.33823: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.33869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.33904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.33990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.34023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.34058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.34104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.34124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.34227: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.34251: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855297.34259: when evaluation is False, skipping this task 30582 1726855297.34273: _execute() done 30582 1726855297.34282: dumping result to json 30582 1726855297.34293: done dumping result, returning 30582 1726855297.34304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000b39] 30582 1726855297.34382: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b39 30582 1726855297.34460: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b39 30582 1726855297.34463: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855297.34538: no more pending results, returning what we have 30582 1726855297.34542: results queue empty 30582 1726855297.34543: checking for any_errors_fatal 30582 1726855297.34550: done checking for any_errors_fatal 30582 1726855297.34551: checking for max_fail_percentage 30582 1726855297.34554: done checking for max_fail_percentage 30582 1726855297.34555: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.34556: done checking to see if all hosts have failed 30582 1726855297.34557: getting the remaining hosts for this loop 30582 1726855297.34558: done getting the remaining hosts for this loop 30582 1726855297.34562: getting the next task for host managed_node3 30582 1726855297.34572: done getting next task for host managed_node3 30582 1726855297.34577: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855297.34583: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.34608: getting variables 30582 1726855297.34610: in VariableManager get_vars() 30582 1726855297.34650: Calling all_inventory to load vars for managed_node3 30582 1726855297.34653: Calling groups_inventory to load vars for managed_node3 30582 1726855297.34655: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.34668: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.34671: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.34674: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.36281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.37829: done with get_vars() 30582 1726855297.37862: done getting variables 30582 1726855297.37930: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:01:37 -0400 (0:00:00.081) 0:00:33.729 ****** 30582 1726855297.37967: entering _queue_task() for managed_node3/fail 30582 1726855297.38251: worker is 1 (out of 1 available) 30582 1726855297.38266: exiting _queue_task() for managed_node3/fail 30582 1726855297.38278: done queuing things up, now waiting for results queue to drain 30582 1726855297.38280: waiting for pending results... 30582 1726855297.38472: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855297.38575: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3a 30582 1726855297.38590: variable 'ansible_search_path' from source: unknown 30582 1726855297.38594: variable 'ansible_search_path' from source: unknown 30582 1726855297.38623: calling self._execute() 30582 1726855297.38701: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.38705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.38716: variable 'omit' from source: magic vars 30582 1726855297.39002: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.39011: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.39099: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.39231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.41802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.41808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.41811: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.41813: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.41850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.41956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.41995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.42025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.42070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.42095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.42145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.42171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.42204: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.42244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.42361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.42364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.42366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.42368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.42400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.42418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.42592: variable 'network_connections' from source: include params 30582 1726855297.42612: variable 'interface' from source: play vars 30582 1726855297.42685: variable 'interface' from source: play vars 30582 1726855297.42758: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.42922: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.42962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.43000: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.43045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.43094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.43123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.43154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.43185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.43256: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855297.43572: variable 'network_connections' from source: include params 30582 1726855297.43588: variable 'interface' from source: play vars 30582 1726855297.43657: variable 'interface' from source: play vars 30582 1726855297.43783: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855297.43788: when evaluation is False, skipping this task 30582 
1726855297.43791: _execute() done 30582 1726855297.43793: dumping result to json 30582 1726855297.43795: done dumping result, returning 30582 1726855297.43797: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000b3a] 30582 1726855297.43799: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3a 30582 1726855297.43881: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3a 30582 1726855297.44094: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855297.44146: no more pending results, returning what we have 30582 1726855297.44150: results queue empty 30582 1726855297.44151: checking for any_errors_fatal 30582 1726855297.44157: done checking for any_errors_fatal 30582 1726855297.44158: checking for max_fail_percentage 30582 1726855297.44160: done checking for max_fail_percentage 30582 1726855297.44161: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.44162: done checking to see if all hosts have failed 30582 1726855297.44163: getting the remaining hosts for this loop 30582 1726855297.44164: done getting the remaining hosts for this loop 30582 1726855297.44168: getting the next task for host managed_node3 30582 1726855297.44179: done getting next task for host managed_node3 30582 1726855297.44184: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855297.44192: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.44213: getting variables 30582 1726855297.44215: in VariableManager get_vars() 30582 1726855297.44253: Calling all_inventory to load vars for managed_node3 30582 1726855297.44256: Calling groups_inventory to load vars for managed_node3 30582 1726855297.44259: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.44270: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.44276: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.44280: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.45950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.47543: done with get_vars() 30582 1726855297.47582: done getting variables 30582 1726855297.47646: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:01:37 -0400 (0:00:00.097) 0:00:33.826 ****** 30582 1726855297.47688: entering _queue_task() for managed_node3/package 30582 1726855297.48061: worker is 1 (out of 1 available) 30582 1726855297.48079: exiting _queue_task() for managed_node3/package 30582 1726855297.48294: done queuing things up, now waiting for results queue to drain 30582 1726855297.48296: waiting for pending results... 30582 1726855297.48401: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855297.48567: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3b 30582 1726855297.48595: variable 'ansible_search_path' from source: unknown 30582 1726855297.48604: variable 'ansible_search_path' from source: unknown 30582 1726855297.48648: calling self._execute() 30582 1726855297.48752: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.48763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.48781: variable 'omit' from source: magic vars 30582 1726855297.49177: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.49198: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.49493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.49684: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.49740: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.49782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.49859: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.49985: variable 'network_packages' from source: role '' defaults 30582 1726855297.50104: variable '__network_provider_setup' from source: role '' defaults 30582 1726855297.50154: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855297.50192: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855297.50207: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855297.50279: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855297.50468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.52533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.52654: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.52657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.52689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.52720: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.52814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.52846: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.52883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.52926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.52982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.53001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.53026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.53051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.53098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.53117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855297.53343: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855297.53693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.53696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.53698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.53699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.53701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.53703: variable 'ansible_python' from source: facts 30582 1726855297.53704: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855297.53757: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855297.53848: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855297.53984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.54016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.54050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.54097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.54116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.54170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.54213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.54242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.54294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.54314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.54469: variable 'network_connections' from source: include params 
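The entries above resolve the role's package-related defaults; just below, the "Install packages" task is gated on `not network_packages is subset(ansible_facts.packages.keys())`. A minimal Python sketch of the equivalent set logic, assuming illustrative package names (the real values come from role defaults and gathered package facts):

```python
# Hypothetical data shaped like the log's variables: network_packages is the
# required-package list; ansible_facts.packages maps installed package names
# to version info. The names and versions below are made up for illustration.
required = ["NetworkManager"]
installed = {"NetworkManager": [{"version": "1.48.10"}],
             "openssh": [{"version": "9.3"}]}

# Plain-Python equivalent of Ansible's "is subset(...)" test: install only
# when some required package is not already present in the package facts.
needs_install = not set(required).issubset(installed.keys())
print(needs_install)  # False -> the task skips, as in this run
```

When every required package is already in the facts, the conditional is False and the task is skipped with `skip_reason: "Conditional result was False"`, exactly as the result payload below shows.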
30582 1726855297.54489: variable 'interface' from source: play vars 30582 1726855297.54595: variable 'interface' from source: play vars 30582 1726855297.54670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.54710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.54745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.54784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.54843: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.55147: variable 'network_connections' from source: include params 30582 1726855297.55157: variable 'interface' from source: play vars 30582 1726855297.55263: variable 'interface' from source: play vars 30582 1726855297.55322: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855297.55454: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.55743: variable 'network_connections' from source: include params 30582 1726855297.55753: variable 'interface' from source: play vars 30582 1726855297.55825: variable 'interface' from source: play vars 30582 1726855297.55854: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855297.55942: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855297.56270: variable 'network_connections' 
from source: include params 30582 1726855297.56283: variable 'interface' from source: play vars 30582 1726855297.56435: variable 'interface' from source: play vars 30582 1726855297.56438: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855297.56489: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855297.56502: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855297.56565: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855297.56793: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855297.57272: variable 'network_connections' from source: include params 30582 1726855297.57289: variable 'interface' from source: play vars 30582 1726855297.57354: variable 'interface' from source: play vars 30582 1726855297.57369: variable 'ansible_distribution' from source: facts 30582 1726855297.57382: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.57394: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.57430: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855297.57601: variable 'ansible_distribution' from source: facts 30582 1726855297.57612: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.57633: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.57640: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855297.57851: variable 'ansible_distribution' from source: facts 30582 1726855297.57854: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.57857: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.57864: variable 'network_provider' from source: set_fact 30582 
1726855297.57891: variable 'ansible_facts' from source: unknown 30582 1726855297.58651: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855297.58660: when evaluation is False, skipping this task 30582 1726855297.58669: _execute() done 30582 1726855297.58680: dumping result to json 30582 1726855297.58689: done dumping result, returning 30582 1726855297.58719: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000000b3b] 30582 1726855297.58722: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3b skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855297.58875: no more pending results, returning what we have 30582 1726855297.58880: results queue empty 30582 1726855297.58881: checking for any_errors_fatal 30582 1726855297.58888: done checking for any_errors_fatal 30582 1726855297.58889: checking for max_fail_percentage 30582 1726855297.58892: done checking for max_fail_percentage 30582 1726855297.58893: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.58893: done checking to see if all hosts have failed 30582 1726855297.58894: getting the remaining hosts for this loop 30582 1726855297.58896: done getting the remaining hosts for this loop 30582 1726855297.58900: getting the next task for host managed_node3 30582 1726855297.58910: done getting next task for host managed_node3 30582 1726855297.58914: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855297.58920: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.58941: getting variables 30582 1726855297.58943: in VariableManager get_vars() 30582 1726855297.58986: Calling all_inventory to load vars for managed_node3 30582 1726855297.59293: Calling groups_inventory to load vars for managed_node3 30582 1726855297.59301: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.59311: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.59314: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.59317: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.60001: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3b 30582 1726855297.60005: WORKER PROCESS EXITING 30582 1726855297.60749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.62483: done with get_vars() 30582 1726855297.62513: done getting variables 30582 1726855297.62577: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:01:37 -0400 (0:00:00.149) 0:00:33.976 ****** 30582 1726855297.62616: entering _queue_task() for managed_node3/package 30582 1726855297.62988: worker is 1 (out of 1 available) 30582 1726855297.63003: exiting _queue_task() for managed_node3/package 30582 1726855297.63015: done queuing things up, now waiting for results queue to drain 30582 1726855297.63017: waiting for pending results... 30582 1726855297.63321: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855297.63468: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3c 30582 1726855297.63493: variable 'ansible_search_path' from source: unknown 30582 1726855297.63502: variable 'ansible_search_path' from source: unknown 30582 1726855297.63545: calling self._execute() 30582 1726855297.63649: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.63667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.63685: variable 'omit' from source: magic vars 30582 1726855297.64068: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.64090: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.64221: variable 'network_state' from source: role '' defaults 30582 1726855297.64313: Evaluated conditional (network_state != {}): False 30582 1726855297.64317: when evaluation 
is False, skipping this task 30582 1726855297.64319: _execute() done 30582 1726855297.64322: dumping result to json 30582 1726855297.64324: done dumping result, returning 30582 1726855297.64327: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000b3c] 30582 1726855297.64329: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3c 30582 1726855297.64409: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3c 30582 1726855297.64413: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855297.64465: no more pending results, returning what we have 30582 1726855297.64470: results queue empty 30582 1726855297.64471: checking for any_errors_fatal 30582 1726855297.64482: done checking for any_errors_fatal 30582 1726855297.64483: checking for max_fail_percentage 30582 1726855297.64485: done checking for max_fail_percentage 30582 1726855297.64486: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.64488: done checking to see if all hosts have failed 30582 1726855297.64489: getting the remaining hosts for this loop 30582 1726855297.64491: done getting the remaining hosts for this loop 30582 1726855297.64495: getting the next task for host managed_node3 30582 1726855297.64504: done getting next task for host managed_node3 30582 1726855297.64508: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855297.64514: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.64539: getting variables 30582 1726855297.64542: in VariableManager get_vars() 30582 1726855297.64583: Calling all_inventory to load vars for managed_node3 30582 1726855297.64690: Calling groups_inventory to load vars for managed_node3 30582 1726855297.64694: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.64708: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.64712: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.64716: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.66432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.68041: done with get_vars() 30582 1726855297.68076: done getting variables 30582 1726855297.68142: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:01:37 -0400 (0:00:00.055) 0:00:34.031 ****** 30582 1726855297.68181: entering _queue_task() for managed_node3/package 30582 1726855297.68553: worker is 1 (out of 1 available) 30582 1726855297.68566: exiting _queue_task() for managed_node3/package 30582 1726855297.68580: done queuing things up, now waiting for results queue to drain 30582 1726855297.68582: waiting for pending results... 30582 1726855297.69007: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855297.69027: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3d 30582 1726855297.69045: variable 'ansible_search_path' from source: unknown 30582 1726855297.69054: variable 'ansible_search_path' from source: unknown 30582 1726855297.69101: calling self._execute() 30582 1726855297.69202: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.69324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.69327: variable 'omit' from source: magic vars 30582 1726855297.69607: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.69624: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.69757: variable 'network_state' from source: role '' defaults 30582 1726855297.69778: Evaluated conditional (network_state != {}): False 30582 1726855297.69789: when evaluation is False, skipping this task 30582 1726855297.69797: _execute() done 30582 1726855297.69804: dumping 
result to json 30582 1726855297.69811: done dumping result, returning 30582 1726855297.69823: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000b3d] 30582 1726855297.69834: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855297.70023: no more pending results, returning what we have 30582 1726855297.70028: results queue empty 30582 1726855297.70029: checking for any_errors_fatal 30582 1726855297.70036: done checking for any_errors_fatal 30582 1726855297.70037: checking for max_fail_percentage 30582 1726855297.70039: done checking for max_fail_percentage 30582 1726855297.70040: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.70041: done checking to see if all hosts have failed 30582 1726855297.70042: getting the remaining hosts for this loop 30582 1726855297.70044: done getting the remaining hosts for this loop 30582 1726855297.70048: getting the next task for host managed_node3 30582 1726855297.70058: done getting next task for host managed_node3 30582 1726855297.70062: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855297.70069: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855297.70097: getting variables 30582 1726855297.70100: in VariableManager get_vars() 30582 1726855297.70140: Calling all_inventory to load vars for managed_node3 30582 1726855297.70143: Calling groups_inventory to load vars for managed_node3 30582 1726855297.70145: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.70157: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.70161: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.70163: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.70800: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3d 30582 1726855297.70804: WORKER PROCESS EXITING 30582 1726855297.71832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.73570: done with get_vars() 30582 1726855297.73598: done getting variables 30582 1726855297.73657: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:01:37 -0400 (0:00:00.055) 0:00:34.086 ****** 30582 1726855297.73698: entering _queue_task() for managed_node3/service 30582 1726855297.74065: worker is 1 (out of 1 available) 30582 1726855297.74082: exiting _queue_task() for managed_node3/service 30582 1726855297.74196: done queuing things up, now waiting for results queue to drain 30582 1726855297.74199: waiting for pending results... 30582 1726855297.74406: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855297.74560: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3e 30582 1726855297.74584: variable 'ansible_search_path' from source: unknown 30582 1726855297.74594: variable 'ansible_search_path' from source: unknown 30582 1726855297.74638: calling self._execute() 30582 1726855297.74743: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.74760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.74797: variable 'omit' from source: magic vars 30582 1726855297.75121: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.75130: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.75216: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.75346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855297.77055: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.77058: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.77295: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.77299: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.77301: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.77304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.77306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.77308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.77352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.77371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.77431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.77460: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.77491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.77542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.77562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.77608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.77679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.77723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.77793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.77826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
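The restart task evaluated just below fires only if `__network_wireless_connections_defined or __network_team_connections_defined` holds. A hedged sketch of that check, assuming the role derives these flags by scanning `network_connections` for wireless or team connection types (the connection dict here is illustrative, not taken from this run):

```python
# Illustrative network_connections list; in this run the play defines a
# single non-wireless, non-team interface, so both flags come out False.
network_connections = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]

# Assumed equivalent of the role's two derived flags:
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> NetworkManager is not restarted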
30582 1726855297.77974: variable 'network_connections' from source: include params 30582 1726855297.77982: variable 'interface' from source: play vars 30582 1726855297.78054: variable 'interface' from source: play vars 30582 1726855297.78110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.78221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.78257: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.78283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.78311: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.78341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.78356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.78374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.78397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.78442: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855297.78599: variable 'network_connections' from source: include params 30582 1726855297.78602: variable 'interface' from source: play vars 30582 1726855297.78647: variable 
'interface' from source: play vars 30582 1726855297.78672: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855297.78676: when evaluation is False, skipping this task 30582 1726855297.78679: _execute() done 30582 1726855297.78682: dumping result to json 30582 1726855297.78686: done dumping result, returning 30582 1726855297.78695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000b3e] 30582 1726855297.78700: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3e 30582 1726855297.78790: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3e 30582 1726855297.78800: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855297.78843: no more pending results, returning what we have 30582 1726855297.78847: results queue empty 30582 1726855297.78848: checking for any_errors_fatal 30582 1726855297.78858: done checking for any_errors_fatal 30582 1726855297.78859: checking for max_fail_percentage 30582 1726855297.78861: done checking for max_fail_percentage 30582 1726855297.78861: checking to see if all hosts have failed and the running result is not ok 30582 1726855297.78862: done checking to see if all hosts have failed 30582 1726855297.78863: getting the remaining hosts for this loop 30582 1726855297.78864: done getting the remaining hosts for this loop 30582 1726855297.78868: getting the next task for host managed_node3 30582 1726855297.78877: done getting next task for host managed_node3 30582 1726855297.78882: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855297.78886: ^ state is: HOST STATE: block=5, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855297.78908: getting variables 30582 1726855297.78909: in VariableManager get_vars() 30582 1726855297.78944: Calling all_inventory to load vars for managed_node3 30582 1726855297.78946: Calling groups_inventory to load vars for managed_node3 30582 1726855297.78948: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855297.78958: Calling all_plugins_play to load vars for managed_node3 30582 1726855297.78961: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855297.78963: Calling groups_plugins_play to load vars for managed_node3 30582 1726855297.79788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855297.80668: done with get_vars() 30582 1726855297.80686: done getting variables 30582 1726855297.80732: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:01:37 -0400 (0:00:00.070) 0:00:34.157 ****** 30582 1726855297.80759: entering _queue_task() for managed_node3/service 30582 1726855297.81014: worker is 1 (out of 1 available) 30582 1726855297.81029: exiting _queue_task() for managed_node3/service 30582 1726855297.81041: done queuing things up, now waiting for results queue to drain 30582 1726855297.81043: waiting for pending results... 
30582 1726855297.81229: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855297.81318: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b3f 30582 1726855297.81358: variable 'ansible_search_path' from source: unknown 30582 1726855297.81362: variable 'ansible_search_path' from source: unknown 30582 1726855297.81367: calling self._execute() 30582 1726855297.81448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.81452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.81459: variable 'omit' from source: magic vars 30582 1726855297.81782: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.81993: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855297.81997: variable 'network_provider' from source: set_fact 30582 1726855297.82000: variable 'network_state' from source: role '' defaults 30582 1726855297.82002: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855297.82004: variable 'omit' from source: magic vars 30582 1726855297.82062: variable 'omit' from source: magic vars 30582 1726855297.82097: variable 'network_service_name' from source: role '' defaults 30582 1726855297.82162: variable 'network_service_name' from source: role '' defaults 30582 1726855297.82281: variable '__network_provider_setup' from source: role '' defaults 30582 1726855297.82293: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855297.82359: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855297.82376: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855297.82442: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855297.82700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 30582 1726855297.84410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855297.84743: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855297.84771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855297.84800: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855297.84820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855297.84935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.84939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.84955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.85001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.85013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.85072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30582 1726855297.85097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.85118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.85147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.85192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.85407: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855297.85497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.85519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.85542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.85585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.85602: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.85686: variable 'ansible_python' from source: facts 30582 1726855297.85703: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855297.85791: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855297.85863: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855297.85986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.86010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.86040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.86079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.86093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.86136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855297.86206: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855297.86209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.86218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855297.86238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855297.86385: variable 'network_connections' from source: include params 30582 1726855297.86492: variable 'interface' from source: play vars 30582 1726855297.86496: variable 'interface' from source: play vars 30582 1726855297.86550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855297.86707: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855297.86766: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855297.86813: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855297.86842: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855297.87092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855297.87095: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855297.87097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855297.87099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855297.87101: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.87319: variable 'network_connections' from source: include params 30582 1726855297.87331: variable 'interface' from source: play vars 30582 1726855297.87407: variable 'interface' from source: play vars 30582 1726855297.87458: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855297.87550: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855297.87833: variable 'network_connections' from source: include params 30582 1726855297.87844: variable 'interface' from source: play vars 30582 1726855297.87918: variable 'interface' from source: play vars 30582 1726855297.87948: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855297.88028: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855297.88226: variable 'network_connections' from source: include params 30582 1726855297.88230: variable 'interface' from source: play vars 30582 1726855297.88282: variable 'interface' from source: play vars 30582 1726855297.88328: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855297.88369: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 
1726855297.88378: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855297.88419: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855297.88553: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855297.88863: variable 'network_connections' from source: include params 30582 1726855297.88866: variable 'interface' from source: play vars 30582 1726855297.88913: variable 'interface' from source: play vars 30582 1726855297.88919: variable 'ansible_distribution' from source: facts 30582 1726855297.88922: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.88927: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.88950: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855297.89063: variable 'ansible_distribution' from source: facts 30582 1726855297.89066: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.89071: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.89080: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855297.89192: variable 'ansible_distribution' from source: facts 30582 1726855297.89195: variable '__network_rh_distros' from source: role '' defaults 30582 1726855297.89199: variable 'ansible_distribution_major_version' from source: facts 30582 1726855297.89225: variable 'network_provider' from source: set_fact 30582 1726855297.89245: variable 'omit' from source: magic vars 30582 1726855297.89267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855297.89290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855297.89305: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855297.89318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855297.89327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855297.89352: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855297.89354: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.89357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.89431: Set connection var ansible_timeout to 10 30582 1726855297.89434: Set connection var ansible_connection to ssh 30582 1726855297.89439: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855297.89444: Set connection var ansible_pipelining to False 30582 1726855297.89455: Set connection var ansible_shell_executable to /bin/sh 30582 1726855297.89458: Set connection var ansible_shell_type to sh 30582 1726855297.89476: variable 'ansible_shell_executable' from source: unknown 30582 1726855297.89479: variable 'ansible_connection' from source: unknown 30582 1726855297.89482: variable 'ansible_module_compression' from source: unknown 30582 1726855297.89484: variable 'ansible_shell_type' from source: unknown 30582 1726855297.89486: variable 'ansible_shell_executable' from source: unknown 30582 1726855297.89490: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855297.89493: variable 'ansible_pipelining' from source: unknown 30582 1726855297.89495: variable 'ansible_timeout' from source: unknown 30582 1726855297.89497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855297.89570: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855297.89580: variable 'omit' from source: magic vars 30582 1726855297.89583: starting attempt loop 30582 1726855297.89586: running the handler 30582 1726855297.89640: variable 'ansible_facts' from source: unknown 30582 1726855297.90235: _low_level_execute_command(): starting 30582 1726855297.90238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855297.90844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855297.90852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855297.90864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855297.90909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855297.90959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855297.90972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855297.91048: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855297.92735: stdout chunk (state=3): >>>/root <<< 30582 1726855297.92838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855297.92865: stderr chunk (state=3): >>><<< 30582 1726855297.92868: stdout chunk (state=3): >>><<< 30582 1726855297.92889: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855297.92900: _low_level_execute_command(): starting 30582 1726855297.92906: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548 `" && echo ansible-tmp-1726855297.9288955-32190-166148606528548="` echo 
/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548 `" ) && sleep 0' 30582 1726855297.93460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855297.93465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855297.93467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855297.93549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855297.93600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855297.95515: stdout chunk (state=3): >>>ansible-tmp-1726855297.9288955-32190-166148606528548=/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548 <<< 30582 1726855297.95670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855297.95692: stderr chunk (state=3): >>><<< 30582 1726855297.95710: stdout chunk (state=3): >>><<< 30582 1726855297.95742: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855297.9288955-32190-166148606528548=/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855297.95775: variable 'ansible_module_compression' from source: unknown 30582 1726855297.95864: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855297.95927: variable 'ansible_facts' from source: unknown 30582 1726855297.96138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py 30582 1726855297.96323: Sending initial data 30582 1726855297.96325: Sent initial data (156 bytes) 30582 1726855297.96867: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30582 1726855297.96871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855297.96877: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855297.96885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855297.96890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855297.96963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855297.96970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855297.97045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855297.98625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855297.98698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855297.98768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpkfcnxuht /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py <<< 30582 1726855297.98780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py" <<< 30582 1726855297.98846: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpkfcnxuht" to remote "/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py" <<< 30582 1726855297.98849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py" <<< 30582 1726855297.99993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.00029: stderr chunk (state=3): >>><<< 30582 1726855298.00032: stdout chunk (state=3): >>><<< 30582 1726855298.00075: done transferring module to remote 30582 1726855298.00095: _low_level_execute_command(): starting 30582 1726855298.00099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/ /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py && sleep 0' 30582 1726855298.00557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855298.00560: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855298.00563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855298.00565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855298.00567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.00615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855298.00619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.00684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.02507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.02511: stderr chunk (state=3): >>><<< 30582 1726855298.02514: stdout chunk (state=3): >>><<< 30582 1726855298.02535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855298.02540: _low_level_execute_command(): starting 30582 1726855298.02542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/AnsiballZ_systemd.py && sleep 0' 30582 1726855298.03183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.03189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.03192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855298.03194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855298.03196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.03273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855298.03277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.03279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.03345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.32408: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 
13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10608640", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327225856", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2071208000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not 
set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855298.32420: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.s<<< 30582 1726855298.32433: stdout chunk (state=3): >>>ocket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855298.34214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
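The module's stdout above is one JSON document describing the unit's full systemd property set. A minimal sketch of how a caller could inspect such a result — the dict below is a hand-trimmed excerpt reproducing only a few fields from the log, not the full payload:

```python
import json

# Trimmed excerpt of the AnsiballZ_systemd.py result seen above; the real
# payload carries hundreds of systemd unit properties.
result_json = '''
{"name": "NetworkManager", "changed": false,
 "status": {"ActiveState": "active", "SubState": "running",
            "UnitFileState": "enabled", "MainPID": "707"},
 "enabled": true, "state": "started"}
'''

result = json.loads(result_json)

# The task is idempotent: the unit was already running and enabled,
# so the module reports changed=false, as in the log.
assert result["changed"] is False
assert result["status"]["ActiveState"] == "active"
assert result["status"]["UnitFileState"] == "enabled"
```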
<<< 30582 1726855298.34243: stderr chunk (state=3): >>><<< 30582 1726855298.34246: stdout chunk (state=3): >>><<< 30582 1726855298.34262: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10608640", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3327225856", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2071208000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
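The transcript shows Ansible's standard three-step remote module lifecycle: upload the AnsiballZ payload over SFTP, `chmod u+x` and execute it, then remove the temporary directory. A sketch reconstructing the wrapped shell commands exactly as they appear in the log (the `sh_wrap` helper is illustrative, not an Ansible API; the temp-dir path is copied from the log):

```python
# Paths copied verbatim from the log above.
tmpdir = "/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/"
module = tmpdir + "AnsiballZ_systemd.py"

def sh_wrap(cmd: str) -> str:
    # _low_level_execute_command() wraps every remote command this way;
    # the trailing "&& sleep 0" is appended to each command, as seen
    # throughout the transcript.
    return f"/bin/sh -c '{cmd} && sleep 0'"

chmod_cmd = sh_wrap(f"chmod u+x {tmpdir} {module}")
exec_cmd  = sh_wrap(f"/usr/bin/python3.12 {module}")
clean_cmd = sh_wrap(f"rm -f -r {tmpdir} > /dev/null 2>&1")

# Each of these matches a "_low_level_execute_command(): executing:" line above.
assert exec_cmd.endswith("AnsiballZ_systemd.py && sleep 0'")
```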
30582 1726855298.34392: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855298.34409: _low_level_execute_command(): starting 30582 1726855298.34412: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855297.9288955-32190-166148606528548/ > /dev/null 2>&1 && sleep 0' 30582 1726855298.34861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855298.34864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855298.34866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.34869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.34871: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.34926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855298.34934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.34936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.34991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.36801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.36826: stderr chunk (state=3): >>><<< 30582 1726855298.36829: stdout chunk (state=3): >>><<< 30582 1726855298.36843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855298.36849: handler run complete 30582 1726855298.36897: attempt loop complete, returning result 30582 1726855298.36902: _execute() done 30582 1726855298.36904: dumping result to json 30582 1726855298.36917: done dumping result, returning 30582 1726855298.36925: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000000b3f] 30582 1726855298.36930: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3f 30582 1726855298.37168: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b3f 30582 1726855298.37171: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855298.37230: no more pending results, returning what we have 30582 1726855298.37233: results queue empty 30582 1726855298.37234: checking for any_errors_fatal 30582 1726855298.37240: done checking for any_errors_fatal 30582 1726855298.37240: checking for max_fail_percentage 30582 1726855298.37242: done checking for max_fail_percentage 30582 1726855298.37243: checking to see if all hosts have failed and the running result is not ok 30582 1726855298.37244: done checking to see if all hosts have failed 30582 1726855298.37244: getting the remaining hosts for this loop 30582 1726855298.37246: done getting the remaining hosts for this loop 30582 1726855298.37249: getting the next task for host managed_node3 30582 1726855298.37256: done getting next task for host managed_node3 30582 1726855298.37260: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855298.37264: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855298.37275: getting variables 30582 1726855298.37277: in VariableManager get_vars() 30582 1726855298.37310: Calling all_inventory to load vars for managed_node3 30582 1726855298.37313: Calling groups_inventory to load vars for managed_node3 30582 1726855298.37314: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855298.37324: Calling all_plugins_play to load vars for managed_node3 30582 1726855298.37326: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855298.37329: Calling groups_plugins_play to load vars for managed_node3 30582 1726855298.38401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855298.39281: done with get_vars() 30582 1726855298.39301: done getting variables 30582 1726855298.39344: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:01:38 -0400 (0:00:00.586) 0:00:34.743 ****** 30582 1726855298.39374: entering _queue_task() for managed_node3/service 30582 1726855298.39633: worker is 1 (out of 1 available) 30582 1726855298.39649: exiting _queue_task() for managed_node3/service 30582 1726855298.39661: done queuing things up, now waiting for results queue to drain 30582 1726855298.39663: waiting for pending results... 
30582 1726855298.39849: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855298.39943: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b40 30582 1726855298.39954: variable 'ansible_search_path' from source: unknown 30582 1726855298.39957: variable 'ansible_search_path' from source: unknown 30582 1726855298.39990: calling self._execute() 30582 1726855298.40061: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.40064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.40073: variable 'omit' from source: magic vars 30582 1726855298.40355: variable 'ansible_distribution_major_version' from source: facts 30582 1726855298.40364: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855298.40451: variable 'network_provider' from source: set_fact 30582 1726855298.40456: Evaluated conditional (network_provider == "nm"): True 30582 1726855298.40523: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855298.40590: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855298.40708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855298.42133: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855298.42181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855298.42210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855298.42237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855298.42258: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855298.42331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855298.42350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855298.42368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855298.42401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855298.42412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855298.42444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855298.42460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855298.42478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855298.42526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855298.42536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855298.42563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855298.42582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855298.42605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855298.42629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855298.42639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855298.42742: variable 'network_connections' from source: include params 30582 1726855298.42751: variable 'interface' from source: play vars 30582 1726855298.42800: variable 'interface' from source: play vars 30582 1726855298.42853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855298.42964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855298.42994: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855298.43018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855298.43041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855298.43072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855298.43091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855298.43108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855298.43125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855298.43163: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855298.43315: variable 'network_connections' from source: include params 30582 1726855298.43319: variable 'interface' from source: play vars 30582 1726855298.43364: variable 'interface' from source: play vars 30582 1726855298.43397: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855298.43400: when evaluation is False, skipping this task 30582 1726855298.43403: _execute() done 30582 1726855298.43405: dumping result to json 30582 1726855298.43407: done dumping result, returning 30582 1726855298.43415: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000000b40] 30582 
1726855298.43425: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b40 30582 1726855298.43510: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b40 30582 1726855298.43512: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855298.43554: no more pending results, returning what we have 30582 1726855298.43558: results queue empty 30582 1726855298.43559: checking for any_errors_fatal 30582 1726855298.43583: done checking for any_errors_fatal 30582 1726855298.43584: checking for max_fail_percentage 30582 1726855298.43586: done checking for max_fail_percentage 30582 1726855298.43588: checking to see if all hosts have failed and the running result is not ok 30582 1726855298.43589: done checking to see if all hosts have failed 30582 1726855298.43590: getting the remaining hosts for this loop 30582 1726855298.43591: done getting the remaining hosts for this loop 30582 1726855298.43595: getting the next task for host managed_node3 30582 1726855298.43603: done getting next task for host managed_node3 30582 1726855298.43607: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855298.43611: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855298.43631: getting variables 30582 1726855298.43633: in VariableManager get_vars() 30582 1726855298.43668: Calling all_inventory to load vars for managed_node3 30582 1726855298.43671: Calling groups_inventory to load vars for managed_node3 30582 1726855298.43673: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855298.43683: Calling all_plugins_play to load vars for managed_node3 30582 1726855298.43685: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855298.43695: Calling groups_plugins_play to load vars for managed_node3 30582 1726855298.45025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855298.46766: done with get_vars() 30582 1726855298.46799: done getting variables 30582 1726855298.46864: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:01:38 -0400 (0:00:00.075) 0:00:34.818 
****** 30582 1726855298.46909: entering _queue_task() for managed_node3/service 30582 1726855298.47348: worker is 1 (out of 1 available) 30582 1726855298.47365: exiting _queue_task() for managed_node3/service 30582 1726855298.47381: done queuing things up, now waiting for results queue to drain 30582 1726855298.47383: waiting for pending results... 30582 1726855298.47793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855298.47816: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b41 30582 1726855298.47838: variable 'ansible_search_path' from source: unknown 30582 1726855298.47848: variable 'ansible_search_path' from source: unknown 30582 1726855298.47899: calling self._execute() 30582 1726855298.47995: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.48023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.48037: variable 'omit' from source: magic vars 30582 1726855298.48336: variable 'ansible_distribution_major_version' from source: facts 30582 1726855298.48346: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855298.48431: variable 'network_provider' from source: set_fact 30582 1726855298.48436: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855298.48439: when evaluation is False, skipping this task 30582 1726855298.48441: _execute() done 30582 1726855298.48444: dumping result to json 30582 1726855298.48447: done dumping result, returning 30582 1726855298.48457: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000000b41] 30582 1726855298.48460: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b41 30582 1726855298.48550: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b41 30582 1726855298.48553: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855298.48601: no more pending results, returning what we have 30582 1726855298.48605: results queue empty 30582 1726855298.48606: checking for any_errors_fatal 30582 1726855298.48614: done checking for any_errors_fatal 30582 1726855298.48615: checking for max_fail_percentage 30582 1726855298.48617: done checking for max_fail_percentage 30582 1726855298.48618: checking to see if all hosts have failed and the running result is not ok 30582 1726855298.48619: done checking to see if all hosts have failed 30582 1726855298.48619: getting the remaining hosts for this loop 30582 1726855298.48621: done getting the remaining hosts for this loop 30582 1726855298.48624: getting the next task for host managed_node3 30582 1726855298.48633: done getting next task for host managed_node3 30582 1726855298.48637: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855298.48641: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855298.48666: getting variables 30582 1726855298.48668: in VariableManager get_vars() 30582 1726855298.48703: Calling all_inventory to load vars for managed_node3 30582 1726855298.48706: Calling groups_inventory to load vars for managed_node3 30582 1726855298.48707: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855298.48717: Calling all_plugins_play to load vars for managed_node3 30582 1726855298.48719: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855298.48722: Calling groups_plugins_play to load vars for managed_node3 30582 1726855298.49503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855298.50698: done with get_vars() 30582 1726855298.50724: done getting variables 30582 1726855298.50785: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:01:38 -0400 (0:00:00.039) 0:00:34.858 ****** 30582 1726855298.50824: entering _queue_task() for managed_node3/copy 30582 1726855298.51162: worker is 1 (out of 1 available) 30582 1726855298.51181: exiting _queue_task() for managed_node3/copy 30582 1726855298.51199: done queuing things up, now waiting for results queue to drain 30582 1726855298.51201: waiting for 
pending results... 30582 1726855298.51447: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855298.51537: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b42 30582 1726855298.51548: variable 'ansible_search_path' from source: unknown 30582 1726855298.51552: variable 'ansible_search_path' from source: unknown 30582 1726855298.51585: calling self._execute() 30582 1726855298.51656: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.51659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.51668: variable 'omit' from source: magic vars 30582 1726855298.51954: variable 'ansible_distribution_major_version' from source: facts 30582 1726855298.51963: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855298.52049: variable 'network_provider' from source: set_fact 30582 1726855298.52053: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855298.52056: when evaluation is False, skipping this task 30582 1726855298.52059: _execute() done 30582 1726855298.52061: dumping result to json 30582 1726855298.52064: done dumping result, returning 30582 1726855298.52074: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000000b42] 30582 1726855298.52077: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b42 30582 1726855298.52168: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b42 30582 1726855298.52171: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855298.52230: no more pending results, returning what we have 30582 1726855298.52234: results queue empty 30582 
1726855298.52235: checking for any_errors_fatal 30582 1726855298.52240: done checking for any_errors_fatal 30582 1726855298.52241: checking for max_fail_percentage 30582 1726855298.52243: done checking for max_fail_percentage 30582 1726855298.52244: checking to see if all hosts have failed and the running result is not ok 30582 1726855298.52244: done checking to see if all hosts have failed 30582 1726855298.52245: getting the remaining hosts for this loop 30582 1726855298.52247: done getting the remaining hosts for this loop 30582 1726855298.52251: getting the next task for host managed_node3 30582 1726855298.52259: done getting next task for host managed_node3 30582 1726855298.52263: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855298.52268: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855298.52291: getting variables 30582 1726855298.52293: in VariableManager get_vars() 30582 1726855298.52324: Calling all_inventory to load vars for managed_node3 30582 1726855298.52326: Calling groups_inventory to load vars for managed_node3 30582 1726855298.52328: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855298.52337: Calling all_plugins_play to load vars for managed_node3 30582 1726855298.52339: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855298.52342: Calling groups_plugins_play to load vars for managed_node3 30582 1726855298.53381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855298.54996: done with get_vars() 30582 1726855298.55019: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:01:38 -0400 (0:00:00.042) 0:00:34.900 ****** 30582 1726855298.55102: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855298.55412: worker is 1 (out of 1 available) 30582 1726855298.55426: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855298.55437: done queuing things up, now waiting for results queue to drain 30582 1726855298.55439: waiting for pending results... 
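The "Configure networking connection profiles" task queued above dispatches the role's `network_connections` action plugin, with `network_connections` and `interface` resolved from play/include vars. A minimal sketch of a play that would drive the role into this step — the variable values here are illustrative assumptions, not the values used in this run:

```yaml
# Hypothetical sketch of invoking the role so that the
# "Configure networking connection profiles" task runs;
# values are placeholders, not this run's actual data.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: "{{ interface }}"
            type: ethernet
            state: up
```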
30582 1726855298.55815: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855298.55895: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b43 30582 1726855298.55899: variable 'ansible_search_path' from source: unknown 30582 1726855298.55901: variable 'ansible_search_path' from source: unknown 30582 1726855298.55933: calling self._execute() 30582 1726855298.56034: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.56093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.56097: variable 'omit' from source: magic vars 30582 1726855298.56445: variable 'ansible_distribution_major_version' from source: facts 30582 1726855298.56467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855298.56479: variable 'omit' from source: magic vars 30582 1726855298.56563: variable 'omit' from source: magic vars 30582 1726855298.56717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855298.58894: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855298.58899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855298.58901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855298.58935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855298.58968: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855298.59058: variable 'network_provider' from source: set_fact 30582 1726855298.59201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855298.59237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855298.59268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855298.59318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855298.59352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855298.59470: variable 'omit' from source: magic vars 30582 1726855298.59609: variable 'omit' from source: magic vars 30582 1726855298.59731: variable 'network_connections' from source: include params 30582 1726855298.59749: variable 'interface' from source: play vars 30582 1726855298.59819: variable 'interface' from source: play vars 30582 1726855298.59993: variable 'omit' from source: magic vars 30582 1726855298.59996: variable '__lsr_ansible_managed' from source: task vars 30582 1726855298.60044: variable '__lsr_ansible_managed' from source: task vars 30582 1726855298.60299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855298.60692: Loaded config def from plugin (lookup/template) 30582 1726855298.60697: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855298.60700: File lookup term: get_ansible_managed.j2 30582 1726855298.60702: variable 
'ansible_search_path' from source: unknown 30582 1726855298.60705: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855298.60711: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855298.60714: variable 'ansible_search_path' from source: unknown 30582 1726855298.70514: variable 'ansible_managed' from source: unknown 30582 1726855298.70658: variable 'omit' from source: magic vars 30582 1726855298.70815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855298.70892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855298.70918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855298.71098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855298.71101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855298.71103: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855298.71105: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.71107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.71290: Set connection var ansible_timeout to 10 30582 1726855298.71299: Set connection var ansible_connection to ssh 30582 1726855298.71493: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855298.71496: Set connection var ansible_pipelining to False 30582 1726855298.71498: Set connection var ansible_shell_executable to /bin/sh 30582 1726855298.71499: Set connection var ansible_shell_type to sh 30582 1726855298.71501: variable 'ansible_shell_executable' from source: unknown 30582 1726855298.71502: variable 'ansible_connection' from source: unknown 30582 1726855298.71504: variable 'ansible_module_compression' from source: unknown 30582 1726855298.71506: variable 'ansible_shell_type' from source: unknown 30582 1726855298.71507: variable 'ansible_shell_executable' from source: unknown 30582 1726855298.71509: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855298.71511: variable 'ansible_pipelining' from source: unknown 30582 1726855298.71513: variable 'ansible_timeout' from source: unknown 30582 1726855298.71514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855298.71776: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855298.71803: variable 'omit' from 
source: magic vars 30582 1726855298.71815: starting attempt loop 30582 1726855298.71821: running the handler 30582 1726855298.71839: _low_level_execute_command(): starting 30582 1726855298.71849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855298.72564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855298.72583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855298.72602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.72656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.72740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855298.72779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.72873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.74562: stdout chunk (state=3): >>>/root <<< 30582 1726855298.74704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.74797: stdout chunk (state=3): >>><<< 30582 
1726855298.74953: stderr chunk (state=3): >>><<< 30582 1726855298.74973: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855298.74990: _low_level_execute_command(): starting 30582 1726855298.74997: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983 `" && echo ansible-tmp-1726855298.749762-32216-64261284834983="` echo /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983 `" ) && sleep 0' 30582 1726855298.75792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855298.75796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855298.75798: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.75801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855298.75803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855298.75806: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855298.75808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.75811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.75986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.76051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.77947: stdout chunk (state=3): >>>ansible-tmp-1726855298.749762-32216-64261284834983=/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983 <<< 30582 1726855298.78295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.78298: stdout chunk (state=3): >>><<< 30582 1726855298.78300: stderr chunk (state=3): >>><<< 30582 1726855298.78303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855298.749762-32216-64261284834983=/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855298.78305: variable 'ansible_module_compression' from source: unknown 30582 1726855298.78307: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855298.78309: variable 'ansible_facts' from source: unknown 30582 1726855298.78414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py 30582 1726855298.78609: Sending initial data 30582 1726855298.78612: Sent initial data (166 bytes) 30582 1726855298.79124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855298.79133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30582 1726855298.79144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.79158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855298.79170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855298.79185: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855298.79194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.79206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855298.79292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.79320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.79421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.80995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855298.81049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855298.81104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjjzsac_h /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py <<< 30582 1726855298.81107: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py" <<< 30582 1726855298.81165: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjjzsac_h" to remote "/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py" <<< 30582 1726855298.81936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.81976: stderr chunk (state=3): >>><<< 30582 1726855298.81981: stdout chunk (state=3): >>><<< 30582 1726855298.82006: done transferring module to remote 30582 1726855298.82016: _low_level_execute_command(): starting 30582 1726855298.82018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/ /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py && sleep 0' 30582 1726855298.82562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855298.82567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.82569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855298.82572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.82574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855298.82576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.82617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.82686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855298.84419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855298.84459: stderr chunk (state=3): >>><<< 30582 1726855298.84465: stdout chunk (state=3): >>><<< 30582 1726855298.84486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855298.84496: _low_level_execute_command(): starting 30582 1726855298.84502: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/AnsiballZ_network_connections.py && sleep 0' 30582 1726855298.84906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855298.84920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855298.84931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855298.84980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855298.84995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855298.85064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.13328: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855299.16435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855299.16456: stderr chunk (state=3): >>><<< 30582 1726855299.16459: stdout chunk (state=3): >>><<< 30582 1726855299.16499: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855299.16522: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855299.16530: _low_level_execute_command(): starting 30582 1726855299.16535: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855298.749762-32216-64261284834983/ > /dev/null 2>&1 && sleep 0' 30582 1726855299.17151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.17155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855299.17170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855299.17199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.17252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.19267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.19298: stderr chunk (state=3): >>><<< 30582 1726855299.19302: stdout chunk (state=3): >>><<< 30582 1726855299.19320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855299.19325: handler run complete 30582 1726855299.19346: attempt loop complete, returning result 30582 1726855299.19348: _execute() done 30582 1726855299.19351: dumping result to json 30582 1726855299.19356: done dumping result, returning 30582 1726855299.19364: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000000b43] 30582 1726855299.19370: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b43 30582 1726855299.19471: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b43 30582 1726855299.19474: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 30582 1726855299.19580: no more pending results, returning what we have 30582 1726855299.19599: results queue empty 30582 1726855299.19600: checking for any_errors_fatal 30582 1726855299.19609: done checking for any_errors_fatal 30582 1726855299.19609: checking for max_fail_percentage 30582 1726855299.19611: done checking for max_fail_percentage 30582 1726855299.19612: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.19613: done 
checking to see if all hosts have failed 30582 1726855299.19613: getting the remaining hosts for this loop 30582 1726855299.19615: done getting the remaining hosts for this loop 30582 1726855299.19618: getting the next task for host managed_node3 30582 1726855299.19626: done getting next task for host managed_node3 30582 1726855299.19629: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855299.19633: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855299.19646: getting variables 30582 1726855299.19648: in VariableManager get_vars() 30582 1726855299.19690: Calling all_inventory to load vars for managed_node3 30582 1726855299.19693: Calling groups_inventory to load vars for managed_node3 30582 1726855299.19695: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.19709: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.19712: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.19714: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.20757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.21842: done with get_vars() 30582 1726855299.21859: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:01:39 -0400 (0:00:00.668) 0:00:35.569 ****** 30582 1726855299.21935: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855299.22186: worker is 1 (out of 1 available) 30582 1726855299.22200: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855299.22213: done queuing things up, now waiting for results queue to drain 30582 1726855299.22215: waiting for pending results... 
30582 1726855299.22525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855299.22622: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b44 30582 1726855299.22627: variable 'ansible_search_path' from source: unknown 30582 1726855299.22630: variable 'ansible_search_path' from source: unknown 30582 1726855299.22656: calling self._execute() 30582 1726855299.22761: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.22764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.22782: variable 'omit' from source: magic vars 30582 1726855299.23192: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.23195: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.23300: variable 'network_state' from source: role '' defaults 30582 1726855299.23308: Evaluated conditional (network_state != {}): False 30582 1726855299.23311: when evaluation is False, skipping this task 30582 1726855299.23317: _execute() done 30582 1726855299.23320: dumping result to json 30582 1726855299.23322: done dumping result, returning 30582 1726855299.23331: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000000b44] 30582 1726855299.23335: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b44 30582 1726855299.23426: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b44 30582 1726855299.23428: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855299.23476: no more pending results, returning what we have 30582 1726855299.23480: results queue empty 30582 1726855299.23481: checking for any_errors_fatal 30582 1726855299.23495: done checking for any_errors_fatal 
30582 1726855299.23495: checking for max_fail_percentage 30582 1726855299.23497: done checking for max_fail_percentage 30582 1726855299.23498: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.23499: done checking to see if all hosts have failed 30582 1726855299.23500: getting the remaining hosts for this loop 30582 1726855299.23502: done getting the remaining hosts for this loop 30582 1726855299.23505: getting the next task for host managed_node3 30582 1726855299.23513: done getting next task for host managed_node3 30582 1726855299.23517: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855299.23522: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855299.23544: getting variables 30582 1726855299.23545: in VariableManager get_vars() 30582 1726855299.23577: Calling all_inventory to load vars for managed_node3 30582 1726855299.23579: Calling groups_inventory to load vars for managed_node3 30582 1726855299.23581: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.23640: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.23643: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.23647: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.24801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.25654: done with get_vars() 30582 1726855299.25686: done getting variables 30582 1726855299.25739: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:01:39 -0400 (0:00:00.038) 0:00:35.607 ****** 30582 1726855299.25763: entering _queue_task() for managed_node3/debug 30582 1726855299.26009: worker is 1 (out of 1 available) 30582 1726855299.26025: exiting _queue_task() for managed_node3/debug 30582 1726855299.26037: done queuing things up, now waiting for results queue to drain 30582 1726855299.26039: waiting for pending results... 
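Every `-vvvv` debug line above is prefixed with the controller worker's PID (`30582`) and a Unix timestamp, while the task banner repeats the same instant in human-readable form (`Friday 20 September 2024 14:01:39 -0400`). A small sketch that decodes one prefix (`parse_prefix` is an illustrative helper, not part of Ansible):

```python
from datetime import datetime, timezone

# Each "-vvvv" debug line starts with "<pid> <unix_timestamp>:".
# Decoding one prefix from the log shows it matches the task banner
# time ("Friday 20 September 2024 14:01:39 -0400", i.e. 18:01:39 UTC).
def parse_prefix(line: str) -> tuple[int, datetime]:
    pid_field, ts_field = line.split()[:2]
    ts = float(ts_field.rstrip(":"))
    return int(pid_field), datetime.fromtimestamp(ts, tz=timezone.utc)

pid, when = parse_prefix("30582 1726855299.25763: entering _queue_task()")
# pid == 30582; when is 2024-09-20 18:01:39 UTC, i.e. 14:01:39 -0400
```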
30582 1726855299.26226: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855299.26346: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b45 30582 1726855299.26593: variable 'ansible_search_path' from source: unknown 30582 1726855299.26596: variable 'ansible_search_path' from source: unknown 30582 1726855299.26600: calling self._execute() 30582 1726855299.26603: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.26606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.26608: variable 'omit' from source: magic vars 30582 1726855299.26843: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.26851: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.26859: variable 'omit' from source: magic vars 30582 1726855299.26972: variable 'omit' from source: magic vars 30582 1726855299.27197: variable 'omit' from source: magic vars 30582 1726855299.27203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855299.27210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855299.27217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855299.27228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.27255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.27320: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855299.27333: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.27352: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855299.27525: Set connection var ansible_timeout to 10 30582 1726855299.27536: Set connection var ansible_connection to ssh 30582 1726855299.27559: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855299.27579: Set connection var ansible_pipelining to False 30582 1726855299.27590: Set connection var ansible_shell_executable to /bin/sh 30582 1726855299.27599: Set connection var ansible_shell_type to sh 30582 1726855299.27615: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.27618: variable 'ansible_connection' from source: unknown 30582 1726855299.27621: variable 'ansible_module_compression' from source: unknown 30582 1726855299.27623: variable 'ansible_shell_type' from source: unknown 30582 1726855299.27625: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.27627: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.27631: variable 'ansible_pipelining' from source: unknown 30582 1726855299.27634: variable 'ansible_timeout' from source: unknown 30582 1726855299.27638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.27867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855299.27871: variable 'omit' from source: magic vars 30582 1726855299.27875: starting attempt loop 30582 1726855299.27878: running the handler 30582 1726855299.28092: variable '__network_connections_result' from source: set_fact 30582 1726855299.28095: handler run complete 30582 1726855299.28101: attempt loop complete, returning result 30582 1726855299.28109: _execute() done 30582 1726855299.28118: dumping result to json 30582 1726855299.28120: 
done dumping result, returning 30582 1726855299.28130: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000b45] 30582 1726855299.28134: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b45 30582 1726855299.28217: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b45 30582 1726855299.28220: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508"
    ]
}
30582 1726855299.28329: no more pending results, returning what we have 30582 1726855299.28332: results queue empty 30582 1726855299.28333: checking for any_errors_fatal 30582 1726855299.28340: done checking for any_errors_fatal 30582 1726855299.28341: checking for max_fail_percentage 30582 1726855299.28342: done checking for max_fail_percentage 30582 1726855299.28343: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.28344: done checking to see if all hosts have failed 30582 1726855299.28345: getting the remaining hosts for this loop 30582 1726855299.28346: done getting the remaining hosts for this loop 30582 1726855299.28349: getting the next task for host managed_node3 30582 1726855299.28356: done getting next task for host managed_node3 30582 1726855299.28360: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855299.28364: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855299.28377: getting variables 30582 1726855299.28378: in VariableManager get_vars() 30582 1726855299.28408: Calling all_inventory to load vars for managed_node3 30582 1726855299.28411: Calling groups_inventory to load vars for managed_node3 30582 1726855299.28413: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.28453: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.28456: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.28459: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.29801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.31034: done with get_vars() 30582 1726855299.31054: done getting variables 30582 1726855299.31100: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:01:39 -0400 (0:00:00.053) 0:00:35.661 ****** 30582 1726855299.31129: entering _queue_task() for managed_node3/debug 30582 1726855299.31366: worker is 1 (out of 1 available) 30582 1726855299.31381: exiting _queue_task() for managed_node3/debug 30582 1726855299.31395: done queuing things up, now waiting for results queue to drain 30582 1726855299.31397: waiting for pending results... 30582 1726855299.31579: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855299.31667: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b46 30582 1726855299.31680: variable 'ansible_search_path' from source: unknown 30582 1726855299.31684: variable 'ansible_search_path' from source: unknown 30582 1726855299.31713: calling self._execute() 30582 1726855299.31780: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.31784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.31794: variable 'omit' from source: magic vars 30582 1726855299.32055: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.32065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.32071: variable 'omit' from source: magic vars 30582 1726855299.32114: variable 'omit' from source: magic vars 30582 1726855299.32137: variable 'omit' from source: magic vars 30582 1726855299.32169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855299.32198: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855299.32213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855299.32227: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.32236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.32259: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855299.32262: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.32265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.32339: Set connection var ansible_timeout to 10 30582 1726855299.32342: Set connection var ansible_connection to ssh 30582 1726855299.32347: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855299.32351: Set connection var ansible_pipelining to False 30582 1726855299.32356: Set connection var ansible_shell_executable to /bin/sh 30582 1726855299.32358: Set connection var ansible_shell_type to sh 30582 1726855299.32403: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.32407: variable 'ansible_connection' from source: unknown 30582 1726855299.32410: variable 'ansible_module_compression' from source: unknown 30582 1726855299.32412: variable 'ansible_shell_type' from source: unknown 30582 1726855299.32414: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.32416: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.32418: variable 'ansible_pipelining' from source: unknown 30582 1726855299.32420: variable 'ansible_timeout' from source: unknown 30582 1726855299.32421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.32692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855299.32696: variable 'omit' from source: magic vars 30582 1726855299.32698: starting attempt loop 30582 1726855299.32700: running the handler 30582 1726855299.32702: variable '__network_connections_result' from source: set_fact 30582 1726855299.32704: variable '__network_connections_result' from source: set_fact 30582 1726855299.32815: handler run complete 30582 1726855299.32846: attempt loop complete, returning result 30582 1726855299.32854: _execute() done 30582 1726855299.32861: dumping result to json 30582 1726855299.32871: done dumping result, returning 30582 1726855299.32885: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000b46] 30582 1726855299.32896: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b46 30582 1726855299.33003: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b46 30582 1726855299.33009: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508"
        ]
    }
}
30582 1726855299.33456: no more pending results, returning what we have 30582 1726855299.33459: results queue
empty 30582 1726855299.33460: checking for any_errors_fatal 30582 1726855299.33466: done checking for any_errors_fatal 30582 1726855299.33467: checking for max_fail_percentage 30582 1726855299.33469: done checking for max_fail_percentage 30582 1726855299.33469: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.33470: done checking to see if all hosts have failed 30582 1726855299.33471: getting the remaining hosts for this loop 30582 1726855299.33472: done getting the remaining hosts for this loop 30582 1726855299.33477: getting the next task for host managed_node3 30582 1726855299.33483: done getting next task for host managed_node3 30582 1726855299.33486: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855299.33491: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855299.33502: getting variables 30582 1726855299.33503: in VariableManager get_vars() 30582 1726855299.33537: Calling all_inventory to load vars for managed_node3 30582 1726855299.33540: Calling groups_inventory to load vars for managed_node3 30582 1726855299.33541: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.33549: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.33551: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.33553: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.35033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.36552: done with get_vars() 30582 1726855299.36574: done getting variables 30582 1726855299.36636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:01:39 -0400 (0:00:00.055) 0:00:35.716 ****** 30582 1726855299.36672: entering _queue_task() for managed_node3/debug 30582 1726855299.37022: worker is 1 (out of 1 available) 30582 1726855299.37034: exiting _queue_task() for managed_node3/debug 30582 1726855299.37049: done queuing things up, now waiting for results queue to drain 30582 1726855299.37051: waiting for pending results... 
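Each task banner above also carries two timers, e.g. `(0:00:00.055) 0:00:35.716`: the first is the task's own duration, the second the running playbook total (the kind of output a profiling callback prints). A sketch of that accumulation using the deltas from the banners above; the last digit can differ from the log because each printed delta is independently rounded:

```python
# Rebuild the running totals from the per-task deltas shown in the
# banners: start at 0:00:35.607, then tasks of 0.053 s and 0.055 s.
# Illustrative only; the real numbers come from a timer callback.

def format_elapsed(seconds: float) -> str:
    minutes, secs = divmod(seconds, 60)
    hours, minutes = divmod(int(minutes), 60)
    return f"{hours}:{minutes:02d}:{secs:06.3f}"

durations = [0.053, 0.055]   # deltas from the two later banners above
total = 35.607               # cumulative total at the first banner
banners = []
for d in durations:
    total += d
    banners.append(f"(0:00:{d:06.3f}) {format_elapsed(total)}")
```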
30582 1726855299.37340: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855299.37513: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b47 30582 1726855299.37517: variable 'ansible_search_path' from source: unknown 30582 1726855299.37520: variable 'ansible_search_path' from source: unknown 30582 1726855299.37593: calling self._execute() 30582 1726855299.37655: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.37666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.37680: variable 'omit' from source: magic vars 30582 1726855299.37986: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.37996: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.38079: variable 'network_state' from source: role '' defaults 30582 1726855299.38086: Evaluated conditional (network_state != {}): False 30582 1726855299.38091: when evaluation is False, skipping this task 30582 1726855299.38093: _execute() done 30582 1726855299.38096: dumping result to json 30582 1726855299.38099: done dumping result, returning 30582 1726855299.38107: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000000b47] 30582 1726855299.38112: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b47 30582 1726855299.38197: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b47 30582 1726855299.38199: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
30582 1726855299.38243: no more pending results, returning what we have 30582 1726855299.38247: results queue empty 30582 1726855299.38248: checking for any_errors_fatal 30582 1726855299.38258: done checking for any_errors_fatal 30582 1726855299.38259: checking for
max_fail_percentage 30582 1726855299.38261: done checking for max_fail_percentage 30582 1726855299.38262: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.38263: done checking to see if all hosts have failed 30582 1726855299.38264: getting the remaining hosts for this loop 30582 1726855299.38265: done getting the remaining hosts for this loop 30582 1726855299.38269: getting the next task for host managed_node3 30582 1726855299.38280: done getting next task for host managed_node3 30582 1726855299.38284: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855299.38290: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855299.38310: getting variables 30582 1726855299.38311: in VariableManager get_vars() 30582 1726855299.38340: Calling all_inventory to load vars for managed_node3 30582 1726855299.38343: Calling groups_inventory to load vars for managed_node3 30582 1726855299.38345: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.38353: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.38356: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.38359: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.39116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.44258: done with get_vars() 30582 1726855299.44294: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:01:39 -0400 (0:00:00.077) 0:00:35.793 ****** 30582 1726855299.44384: entering _queue_task() for managed_node3/ping 30582 1726855299.44750: worker is 1 (out of 1 available) 30582 1726855299.44763: exiting _queue_task() for managed_node3/ping 30582 1726855299.44776: done queuing things up, now waiting for results queue to drain 30582 1726855299.44778: waiting for pending results... 
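The "Re-test connectivity" task queued above runs the ping module (`entering _queue_task() for managed_node3/ping`), which does no network configuration at all: it round-trips a value (`pong` by default) to prove the host is reachable and can execute a Python module. A toy emulation of that contract (a sketch, not the real module source):

```python
# Minimal emulation of what Ansible's ping module returns. The real
# module also accepts data="crash" to deliberately raise, so error
# handling can be exercised; everything else echoes back.
def ping(data: str = "pong") -> dict:
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}
```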
30582 1726855299.45211: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855299.45277: in run() - task 0affcc66-ac2b-aa83-7d57-000000000b48 30582 1726855299.45328: variable 'ansible_search_path' from source: unknown 30582 1726855299.45332: variable 'ansible_search_path' from source: unknown 30582 1726855299.45437: calling self._execute() 30582 1726855299.45456: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.45468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.45490: variable 'omit' from source: magic vars 30582 1726855299.45886: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.45905: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.45917: variable 'omit' from source: magic vars 30582 1726855299.45989: variable 'omit' from source: magic vars 30582 1726855299.46027: variable 'omit' from source: magic vars 30582 1726855299.46070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855299.46126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855299.46151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855299.46178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.46301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.46306: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855299.46309: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.46311: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855299.46399: Set connection var ansible_timeout to 10 30582 1726855299.46408: Set connection var ansible_connection to ssh 30582 1726855299.46422: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855299.46436: Set connection var ansible_pipelining to False 30582 1726855299.46446: Set connection var ansible_shell_executable to /bin/sh 30582 1726855299.46453: Set connection var ansible_shell_type to sh 30582 1726855299.46484: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.46538: variable 'ansible_connection' from source: unknown 30582 1726855299.46541: variable 'ansible_module_compression' from source: unknown 30582 1726855299.46543: variable 'ansible_shell_type' from source: unknown 30582 1726855299.46546: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.46548: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.46551: variable 'ansible_pipelining' from source: unknown 30582 1726855299.46553: variable 'ansible_timeout' from source: unknown 30582 1726855299.46555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.46756: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855299.46781: variable 'omit' from source: magic vars 30582 1726855299.46793: starting attempt loop 30582 1726855299.46864: running the handler 30582 1726855299.46868: _low_level_execute_command(): starting 30582 1726855299.46870: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855299.47552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855299.47595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 
1726855299.47612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855299.47633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855299.47694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.47766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855299.47792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855299.47824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.47917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.49624: stdout chunk (state=3): >>>/root <<< 30582 1726855299.49768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.49772: stdout chunk (state=3): >>><<< 30582 1726855299.49780: stderr chunk (state=3): >>><<< 30582 1726855299.49812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855299.49877: _low_level_execute_command(): starting 30582 1726855299.49881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180 `" && echo ansible-tmp-1726855299.498206-32254-272874517062180="` echo /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180 `" ) && sleep 0' 30582 1726855299.50470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855299.50485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855299.50543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.50618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855299.50651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855299.50689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.50748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.52626: stdout chunk (state=3): >>>ansible-tmp-1726855299.498206-32254-272874517062180=/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180 <<< 30582 1726855299.52777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.52781: stdout chunk (state=3): >>><<< 30582 1726855299.52783: stderr chunk (state=3): >>><<< 30582 1726855299.52802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855299.498206-32254-272874517062180=/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855299.52931: variable 'ansible_module_compression' from source: unknown 30582 1726855299.52934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855299.52948: variable 'ansible_facts' from source: unknown 30582 1726855299.53035: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py 30582 1726855299.53175: Sending initial data 30582 1726855299.53278: Sent initial data (152 bytes) 30582 1726855299.53795: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855299.53810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855299.53828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855299.53905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.53946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855299.53965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855299.53990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.54096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.55636: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855299.55660: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855299.55711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855299.55780: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptuvc03zw /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py <<< 30582 1726855299.55783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py" <<< 30582 1726855299.55831: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmptuvc03zw" to remote "/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py" <<< 30582 1726855299.56498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.56534: stderr chunk (state=3): >>><<< 30582 1726855299.56658: stdout chunk (state=3): >>><<< 30582 1726855299.56662: done transferring module to remote 30582 1726855299.56664: _low_level_execute_command(): starting 30582 1726855299.56667: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/ /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py && sleep 0' 30582 1726855299.57243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.57282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855299.57290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.57381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.59132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.59157: stderr chunk (state=3): >>><<< 30582 1726855299.59160: stdout chunk (state=3): >>><<< 30582 1726855299.59179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855299.59182: _low_level_execute_command(): starting 30582 1726855299.59186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/AnsiballZ_ping.py && sleep 0' 30582 1726855299.59586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855299.59652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855299.59655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855299.59658: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.59661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.59716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.59776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 
1726855299.74995: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855299.76499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855299.76503: stdout chunk (state=3): >>><<< 30582 1726855299.76506: stderr chunk (state=3): >>><<< 30582 1726855299.76509: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
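The stdout payload recovered above, `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`, is the round-trip result of `ansible.builtin.ping` executed remotely as `AnsiballZ_ping.py`. As a hedged illustration, the module's core contract can be sketched as below (a simplification for reading the log, not the actual module source, which wraps everything in `AnsibleModule` and exits via `exit_json`):

```python
# Simplified sketch of the ansible.builtin.ping contract (assumption:
# the real module adds the "invocation" wrapper seen in the log output).
def ping(data: str = "pong") -> dict:
    if data == "crash":
        # The real module deliberately raises when asked, so test
        # playbooks can exercise failure handling.
        raise Exception("boom")
    return {"ping": data}

print(ping())  # {'ping': 'pong'}
```

With the default `data: pong`, the remote side echoes `"ping": "pong"`, which is exactly the stdout chunk `_low_level_execute_command()` reports here with `rc=0`.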
30582 1726855299.76512: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855299.76514: _low_level_execute_command(): starting 30582 1726855299.76517: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855299.498206-32254-272874517062180/ > /dev/null 2>&1 && sleep 0' 30582 1726855299.77807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855299.77900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855299.77954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855299.78001: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855299.78213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855299.78251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855299.78372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855299.80255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855299.80334: stdout chunk (state=3): >>><<< 30582 1726855299.80338: stderr chunk (state=3): >>><<< 30582 1726855299.80396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855299.80413: handler run complete 30582 
1726855299.80500: attempt loop complete, returning result 30582 1726855299.80503: _execute() done 30582 1726855299.80506: dumping result to json 30582 1726855299.80508: done dumping result, returning 30582 1726855299.80510: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000000b48] 30582 1726855299.80512: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b48 30582 1726855299.80589: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000b48 30582 1726855299.80593: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855299.80668: no more pending results, returning what we have 30582 1726855299.80675: results queue empty 30582 1726855299.80677: checking for any_errors_fatal 30582 1726855299.80682: done checking for any_errors_fatal 30582 1726855299.80683: checking for max_fail_percentage 30582 1726855299.80685: done checking for max_fail_percentage 30582 1726855299.80686: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.80689: done checking to see if all hosts have failed 30582 1726855299.80690: getting the remaining hosts for this loop 30582 1726855299.80691: done getting the remaining hosts for this loop 30582 1726855299.80695: getting the next task for host managed_node3 30582 1726855299.80708: done getting next task for host managed_node3 30582 1726855299.80710: ^ task is: TASK: meta (role_complete) 30582 1726855299.80716: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855299.80729: getting variables 30582 1726855299.80732: in VariableManager get_vars() 30582 1726855299.80776: Calling all_inventory to load vars for managed_node3 30582 1726855299.80779: Calling groups_inventory to load vars for managed_node3 30582 1726855299.80782: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.81296: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.81300: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.81304: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.83676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.85467: done with get_vars() 30582 1726855299.86402: done getting variables 30582 1726855299.86494: done queuing things up, now waiting for results queue to drain 30582 1726855299.86497: results queue empty 30582 1726855299.86498: checking for any_errors_fatal 30582 1726855299.86501: done checking for any_errors_fatal 30582 1726855299.86502: checking for max_fail_percentage 30582 1726855299.86503: done checking for max_fail_percentage 30582 1726855299.86504: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855299.86505: done checking to see if all hosts have failed 30582 1726855299.86505: getting the remaining hosts for this loop 30582 1726855299.86506: done getting the remaining hosts for this loop 30582 1726855299.86509: getting the next task for host managed_node3 30582 1726855299.86514: done getting next task for host managed_node3 30582 1726855299.86517: ^ task is: TASK: Show result 30582 1726855299.86519: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855299.86522: getting variables 30582 1726855299.86523: in VariableManager get_vars() 30582 1726855299.86534: Calling all_inventory to load vars for managed_node3 30582 1726855299.86537: Calling groups_inventory to load vars for managed_node3 30582 1726855299.86539: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.86544: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.86546: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.86549: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.88602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.90284: done with get_vars() 30582 1726855299.90319: done getting variables 30582 1726855299.90372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 14:01:39 -0400 (0:00:00.460) 0:00:36.254 ****** 30582 1726855299.90413: entering _queue_task() for managed_node3/debug 30582 1726855299.90905: worker is 1 (out of 1 available) 30582 1726855299.90917: exiting _queue_task() for managed_node3/debug 30582 1726855299.90929: done queuing things up, now waiting for results queue to drain 30582 1726855299.90931: waiting for pending results... 
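Each line of this trace interleaves worker bookkeeping (`pid timestamp: message`) with raw SSH subprocess output delimited by `>>>…<<<` markers. When mining a log like this one, a small helper (hypothetical; not part of Ansible) can recover just the stdout/stderr payloads:

```python
import re

# Matches the ">>>payload<<<" chunks that _low_level_execute_command()
# logs for each read from the SSH subprocess, tagged by stream name.
CHUNK_RE = re.compile(r"(stdout|stderr) chunk \(state=\d+\): >>>(.*?)<<<", re.S)

def extract_chunks(log_text: str) -> list[tuple[str, str]]:
    """Return (stream, payload) pairs in the order they were logged."""
    return [(stream, payload.strip()) for stream, payload in CHUNK_RE.findall(log_text)]

line = ('30582 1726855299.74995: stdout chunk (state=3): >>> {"ping": "pong"} <<< '
        '30582 1726855299.76499: stderr chunk (state=3): '
        '>>>debug2: Received exit status from master 0 <<<')
print(extract_chunks(line))
```

The non-greedy `(.*?)` is what keeps each chunk bounded at its own closing `<<<`, including the empty `>>><<<` pairs that mark end-of-stream in the log.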
30582 1726855299.91410: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855299.91454: in run() - task 0affcc66-ac2b-aa83-7d57-000000000ad2 30582 1726855299.91497: variable 'ansible_search_path' from source: unknown 30582 1726855299.91501: variable 'ansible_search_path' from source: unknown 30582 1726855299.91539: calling self._execute() 30582 1726855299.91667: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.91671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.91677: variable 'omit' from source: magic vars 30582 1726855299.92268: variable 'ansible_distribution_major_version' from source: facts 30582 1726855299.92693: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855299.92697: variable 'omit' from source: magic vars 30582 1726855299.92700: variable 'omit' from source: magic vars 30582 1726855299.92702: variable 'omit' from source: magic vars 30582 1726855299.92704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855299.92708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855299.92710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855299.92712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.92836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855299.92871: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855299.92921: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.92931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.93352: Set 
connection var ansible_timeout to 10 30582 1726855299.93460: Set connection var ansible_connection to ssh 30582 1726855299.93464: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855299.93467: Set connection var ansible_pipelining to False 30582 1726855299.93469: Set connection var ansible_shell_executable to /bin/sh 30582 1726855299.93471: Set connection var ansible_shell_type to sh 30582 1726855299.93475: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.93478: variable 'ansible_connection' from source: unknown 30582 1726855299.93480: variable 'ansible_module_compression' from source: unknown 30582 1726855299.93482: variable 'ansible_shell_type' from source: unknown 30582 1726855299.93484: variable 'ansible_shell_executable' from source: unknown 30582 1726855299.93486: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855299.93490: variable 'ansible_pipelining' from source: unknown 30582 1726855299.93492: variable 'ansible_timeout' from source: unknown 30582 1726855299.93494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855299.93616: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855299.93700: variable 'omit' from source: magic vars 30582 1726855299.93894: starting attempt loop 30582 1726855299.93898: running the handler 30582 1726855299.93900: variable '__network_connections_result' from source: set_fact 30582 1726855299.94035: variable '__network_connections_result' from source: set_fact 30582 1726855299.94279: handler run complete 30582 1726855299.94358: attempt loop complete, returning result 30582 1726855299.94366: _execute() done 30582 1726855299.94372: dumping result to json 30582 
1726855299.94385: done dumping result, returning 30582 1726855299.94450: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-000000000ad2] 30582 1726855299.94461: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ad2 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508" ] } } 30582 1726855299.94767: no more pending results, returning what we have 30582 1726855299.94772: results queue empty 30582 1726855299.94776: checking for any_errors_fatal 30582 1726855299.94778: done checking for any_errors_fatal 30582 1726855299.94779: checking for max_fail_percentage 30582 1726855299.94781: done checking for max_fail_percentage 30582 1726855299.94782: checking to see if all hosts have failed and the running result is not ok 30582 1726855299.94783: done checking to see if all hosts have failed 30582 1726855299.94783: getting the remaining hosts for this loop 30582 1726855299.94785: done getting the remaining hosts for this loop 30582 1726855299.94791: getting the next task for host managed_node3 30582 1726855299.94801: done getting next task for host managed_node3 30582 1726855299.94805: ^ task is: TASK: Test 30582 1726855299.94808: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855299.94814: getting variables 30582 1726855299.94816: in VariableManager get_vars() 30582 1726855299.94848: Calling all_inventory to load vars for managed_node3 30582 1726855299.94851: Calling groups_inventory to load vars for managed_node3 30582 1726855299.94855: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855299.94866: Calling all_plugins_play to load vars for managed_node3 30582 1726855299.94870: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855299.94875: Calling groups_plugins_play to load vars for managed_node3 30582 1726855299.95509: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ad2 30582 1726855299.95513: WORKER PROCESS EXITING 30582 1726855299.97107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855299.99429: done with get_vars() 30582 1726855299.99465: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:01:39 -0400 (0:00:00.091) 0:00:36.345 ****** 30582 1726855299.99580: entering _queue_task() for managed_node3/include_tasks 30582 1726855299.99955: worker is 1 (out of 1 available) 30582 1726855299.99970: exiting _queue_task() for managed_node3/include_tasks 30582 1726855299.99983: done queuing things up, now waiting for results queue to drain 30582 1726855299.99985: waiting for pending results... 
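The task banners above carry two durations, e.g. `(0:00:00.091) 0:00:36.345`: the elapsed time of the previous task, then the cumulative playbook runtime, printed by a timing callback (likely `profile_tasks` or similar). As a hedged sketch, the `H:MM:SS.mmm` rendering can be reproduced with a hypothetical helper:

```python
# Hypothetical helper mirroring the duration format in the task banners
# ("(0:00:00.091) 0:00:36.345"): hours unpadded, minutes zero-padded,
# seconds zero-padded with millisecond precision.
def fmt_elapsed(seconds: float) -> str:
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours)}:{int(minutes):02d}:{secs:06.3f}"

print(fmt_elapsed(0.091), fmt_elapsed(36.345))  # 0:00:00.091 0:00:36.345
```

Reading the two banners in this section together: the "Show result" debug task took 0.091 s, against 36.345 s of total wall-clock time since the run started.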
30582 1726855300.00319: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855300.00441: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4d 30582 1726855300.00462: variable 'ansible_search_path' from source: unknown 30582 1726855300.00471: variable 'ansible_search_path' from source: unknown 30582 1726855300.00530: variable 'lsr_test' from source: include params 30582 1726855300.00757: variable 'lsr_test' from source: include params 30582 1726855300.00831: variable 'omit' from source: magic vars 30582 1726855300.00986: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.01006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.01022: variable 'omit' from source: magic vars 30582 1726855300.01283: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.01305: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.01396: variable 'item' from source: unknown 30582 1726855300.01400: variable 'item' from source: unknown 30582 1726855300.01427: variable 'item' from source: unknown 30582 1726855300.01495: variable 'item' from source: unknown 30582 1726855300.01996: dumping result to json 30582 1726855300.02000: done dumping result, returning 30582 1726855300.02003: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-000000000a4d] 30582 1726855300.02005: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4d 30582 1726855300.02048: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4d 30582 1726855300.02051: WORKER PROCESS EXITING 30582 1726855300.02122: no more pending results, returning what we have 30582 1726855300.02127: in VariableManager get_vars() 30582 1726855300.02159: Calling all_inventory to load vars for managed_node3 30582 1726855300.02161: Calling groups_inventory to load vars for managed_node3 30582 1726855300.02164: Calling all_plugins_inventory to load 
vars for managed_node3 30582 1726855300.02176: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.02179: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.02181: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.03684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.05370: done with get_vars() 30582 1726855300.05402: variable 'ansible_search_path' from source: unknown 30582 1726855300.05403: variable 'ansible_search_path' from source: unknown 30582 1726855300.05459: we have included files to process 30582 1726855300.05460: generating all_blocks data 30582 1726855300.05471: done generating all_blocks data 30582 1726855300.05479: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855300.05481: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855300.05484: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855300.05689: done processing included file 30582 1726855300.05691: iterating over new_blocks loaded from include file 30582 1726855300.05693: in VariableManager get_vars() 30582 1726855300.05709: done with get_vars() 30582 1726855300.05711: filtering new block on tags 30582 1726855300.05737: done filtering new block on tags 30582 1726855300.05740: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30582 1726855300.05744: extending task lists for all hosts with included blocks 30582 1726855300.07395: done extending task lists 30582 
1726855300.07397: done processing included files 30582 1726855300.07398: results queue empty 30582 1726855300.07398: checking for any_errors_fatal 30582 1726855300.07403: done checking for any_errors_fatal 30582 1726855300.07404: checking for max_fail_percentage 30582 1726855300.07405: done checking for max_fail_percentage 30582 1726855300.07406: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.07407: done checking to see if all hosts have failed 30582 1726855300.07408: getting the remaining hosts for this loop 30582 1726855300.07409: done getting the remaining hosts for this loop 30582 1726855300.07412: getting the next task for host managed_node3 30582 1726855300.07417: done getting next task for host managed_node3 30582 1726855300.07419: ^ task is: TASK: Include network role 30582 1726855300.07421: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855300.07424: getting variables 30582 1726855300.07425: in VariableManager get_vars() 30582 1726855300.07438: Calling all_inventory to load vars for managed_node3 30582 1726855300.07440: Calling groups_inventory to load vars for managed_node3 30582 1726855300.07443: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.07450: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.07452: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.07455: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.09969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.12430: done with get_vars() 30582 1726855300.12453: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 14:01:40 -0400 (0:00:00.129) 0:00:36.475 ****** 30582 1726855300.12556: entering _queue_task() for managed_node3/include_role 30582 1726855300.13067: worker is 1 (out of 1 available) 30582 1726855300.13082: exiting _queue_task() for managed_node3/include_role 30582 1726855300.13495: done queuing things up, now waiting for results queue to drain 30582 1726855300.13497: waiting for pending results... 
30582 1726855300.13705: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855300.14092: in run() - task 0affcc66-ac2b-aa83-7d57-000000000caa 30582 1726855300.14295: variable 'ansible_search_path' from source: unknown 30582 1726855300.14298: variable 'ansible_search_path' from source: unknown 30582 1726855300.14301: calling self._execute() 30582 1726855300.14303: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.14306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.14309: variable 'omit' from source: magic vars 30582 1726855300.15244: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.15264: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.15279: _execute() done 30582 1726855300.15303: dumping result to json 30582 1726855300.15317: done dumping result, returning 30582 1726855300.15329: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000000caa] 30582 1726855300.15339: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000caa 30582 1726855300.15793: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000caa 30582 1726855300.15798: WORKER PROCESS EXITING 30582 1726855300.15823: no more pending results, returning what we have 30582 1726855300.15828: in VariableManager get_vars() 30582 1726855300.15862: Calling all_inventory to load vars for managed_node3 30582 1726855300.15865: Calling groups_inventory to load vars for managed_node3 30582 1726855300.15869: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.15882: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.15885: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.15890: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.17322: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.19908: done with get_vars() 30582 1726855300.19936: variable 'ansible_search_path' from source: unknown 30582 1726855300.19937: variable 'ansible_search_path' from source: unknown 30582 1726855300.20105: variable 'omit' from source: magic vars 30582 1726855300.20151: variable 'omit' from source: magic vars 30582 1726855300.20171: variable 'omit' from source: magic vars 30582 1726855300.20177: we have included files to process 30582 1726855300.20178: generating all_blocks data 30582 1726855300.20180: done generating all_blocks data 30582 1726855300.20181: processing included file: fedora.linux_system_roles.network 30582 1726855300.20204: in VariableManager get_vars() 30582 1726855300.20219: done with get_vars() 30582 1726855300.20247: in VariableManager get_vars() 30582 1726855300.20265: done with get_vars() 30582 1726855300.20313: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855300.20444: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855300.20536: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855300.21039: in VariableManager get_vars() 30582 1726855300.21062: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855300.24200: iterating over new_blocks loaded from include file 30582 1726855300.24203: in VariableManager get_vars() 30582 1726855300.24225: done with get_vars() 30582 1726855300.24227: filtering new block on tags 30582 1726855300.24532: done filtering new block on tags 30582 1726855300.24536: in VariableManager get_vars() 30582 1726855300.24553: done with get_vars() 30582 1726855300.24554: filtering new block on tags 30582 1726855300.24571: done 
filtering new block on tags 30582 1726855300.24576: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855300.24581: extending task lists for all hosts with included blocks 30582 1726855300.24708: done extending task lists 30582 1726855300.24709: done processing included files 30582 1726855300.24710: results queue empty 30582 1726855300.24711: checking for any_errors_fatal 30582 1726855300.24715: done checking for any_errors_fatal 30582 1726855300.24716: checking for max_fail_percentage 30582 1726855300.24717: done checking for max_fail_percentage 30582 1726855300.24718: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.24719: done checking to see if all hosts have failed 30582 1726855300.24720: getting the remaining hosts for this loop 30582 1726855300.24721: done getting the remaining hosts for this loop 30582 1726855300.24724: getting the next task for host managed_node3 30582 1726855300.24728: done getting next task for host managed_node3 30582 1726855300.24735: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855300.24739: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855300.24750: getting variables 30582 1726855300.24751: in VariableManager get_vars() 30582 1726855300.24764: Calling all_inventory to load vars for managed_node3 30582 1726855300.24766: Calling groups_inventory to load vars for managed_node3 30582 1726855300.24769: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.24777: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.24780: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.24783: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.26113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.28762: done with get_vars() 30582 1726855300.28799: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:01:40 -0400 (0:00:00.163) 0:00:36.638 ****** 30582 1726855300.28891: entering _queue_task() for managed_node3/include_tasks 30582 1726855300.29324: worker is 1 (out of 1 available) 30582 1726855300.29337: exiting _queue_task() for managed_node3/include_tasks 30582 1726855300.29348: done queuing things up, now waiting for results queue to drain 30582 1726855300.29350: waiting for pending results... 
30582 1726855300.29642: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855300.29808: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d16 30582 1726855300.29832: variable 'ansible_search_path' from source: unknown 30582 1726855300.29841: variable 'ansible_search_path' from source: unknown 30582 1726855300.29890: calling self._execute() 30582 1726855300.29993: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.30022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.30025: variable 'omit' from source: magic vars 30582 1726855300.30458: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.30462: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.30465: _execute() done 30582 1726855300.30468: dumping result to json 30582 1726855300.30470: done dumping result, returning 30582 1726855300.30473: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000000d16] 30582 1726855300.30484: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d16 30582 1726855300.30752: no more pending results, returning what we have 30582 1726855300.30759: in VariableManager get_vars() 30582 1726855300.30810: Calling all_inventory to load vars for managed_node3 30582 1726855300.30814: Calling groups_inventory to load vars for managed_node3 30582 1726855300.30816: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.30831: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.30835: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.30838: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.31595: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d16 30582 
1726855300.31599: WORKER PROCESS EXITING 30582 1726855300.33377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.35190: done with get_vars() 30582 1726855300.35407: variable 'ansible_search_path' from source: unknown 30582 1726855300.35408: variable 'ansible_search_path' from source: unknown 30582 1726855300.35444: we have included files to process 30582 1726855300.35445: generating all_blocks data 30582 1726855300.35447: done generating all_blocks data 30582 1726855300.35450: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855300.35450: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855300.35452: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855300.36605: done processing included file 30582 1726855300.36607: iterating over new_blocks loaded from include file 30582 1726855300.36609: in VariableManager get_vars() 30582 1726855300.36633: done with get_vars() 30582 1726855300.36635: filtering new block on tags 30582 1726855300.36665: done filtering new block on tags 30582 1726855300.36668: in VariableManager get_vars() 30582 1726855300.36896: done with get_vars() 30582 1726855300.36898: filtering new block on tags 30582 1726855300.36947: done filtering new block on tags 30582 1726855300.36951: in VariableManager get_vars() 30582 1726855300.36973: done with get_vars() 30582 1726855300.36977: filtering new block on tags 30582 1726855300.37020: done filtering new block on tags 30582 1726855300.37022: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855300.37027: extending task lists for all hosts 
with included blocks 30582 1726855300.40466: done extending task lists 30582 1726855300.40468: done processing included files 30582 1726855300.40469: results queue empty 30582 1726855300.40469: checking for any_errors_fatal 30582 1726855300.40473: done checking for any_errors_fatal 30582 1726855300.40476: checking for max_fail_percentage 30582 1726855300.40478: done checking for max_fail_percentage 30582 1726855300.40478: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.40479: done checking to see if all hosts have failed 30582 1726855300.40480: getting the remaining hosts for this loop 30582 1726855300.40481: done getting the remaining hosts for this loop 30582 1726855300.40484: getting the next task for host managed_node3 30582 1726855300.40492: done getting next task for host managed_node3 30582 1726855300.40495: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855300.40499: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855300.40510: getting variables 30582 1726855300.40511: in VariableManager get_vars() 30582 1726855300.40529: Calling all_inventory to load vars for managed_node3 30582 1726855300.40531: Calling groups_inventory to load vars for managed_node3 30582 1726855300.40534: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.40539: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.40542: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.40545: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.42230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.43906: done with get_vars() 30582 1726855300.43939: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:01:40 -0400 (0:00:00.151) 0:00:36.790 ****** 30582 1726855300.44033: entering _queue_task() for managed_node3/setup 30582 1726855300.44557: worker is 1 (out of 1 available) 30582 1726855300.44570: exiting _queue_task() for managed_node3/setup 30582 1726855300.44584: done queuing things up, now waiting for results queue to drain 30582 1726855300.44586: waiting for pending results... 
30582 1726855300.44810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855300.45004: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d6d 30582 1726855300.45027: variable 'ansible_search_path' from source: unknown 30582 1726855300.45040: variable 'ansible_search_path' from source: unknown 30582 1726855300.45193: calling self._execute() 30582 1726855300.45197: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.45200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.45215: variable 'omit' from source: magic vars 30582 1726855300.45628: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.45649: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.45885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855300.48420: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855300.48504: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855300.48547: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855300.48600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855300.48632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855300.48778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855300.48781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855300.48803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855300.49092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855300.49095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855300.49292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855300.49296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855300.49298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855300.49300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855300.49302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855300.49485: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855300.49537: variable 'ansible_facts' from source: unknown 30582 1726855300.51267: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855300.51272: when evaluation is False, skipping this task 30582 1726855300.51278: _execute() done 30582 1726855300.51281: dumping result to json 30582 1726855300.51284: done dumping result, returning 30582 1726855300.51289: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000000d6d] 30582 1726855300.51291: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d6d 30582 1726855300.51364: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d6d 30582 1726855300.51370: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855300.51432: no more pending results, returning what we have 30582 1726855300.51437: results queue empty 30582 1726855300.51439: checking for any_errors_fatal 30582 1726855300.51440: done checking for any_errors_fatal 30582 1726855300.51441: checking for max_fail_percentage 30582 1726855300.51443: done checking for max_fail_percentage 30582 1726855300.51444: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.51445: done checking to see if all hosts have failed 30582 1726855300.51446: getting the remaining hosts for this loop 30582 1726855300.51447: done getting the remaining hosts for this loop 30582 1726855300.51452: getting the next task for host managed_node3 30582 1726855300.51466: done getting next task for host managed_node3 30582 1726855300.51470: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855300.51479: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855300.51503: getting variables 30582 1726855300.51505: in VariableManager get_vars() 30582 1726855300.51546: Calling all_inventory to load vars for managed_node3 30582 1726855300.51549: Calling groups_inventory to load vars for managed_node3 30582 1726855300.51551: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.51563: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.51567: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.51578: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.55310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.58851: done with get_vars() 30582 1726855300.58886: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:01:40 -0400 (0:00:00.150) 0:00:36.940 ****** 30582 1726855300.59237: entering _queue_task() for managed_node3/stat 30582 1726855300.59894: worker is 1 (out of 1 available) 30582 1726855300.59909: exiting _queue_task() for managed_node3/stat 30582 1726855300.59923: done queuing things up, now waiting for results queue to drain 30582 1726855300.59925: waiting for pending results... 
30582 1726855300.60500: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855300.60829: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d6f 30582 1726855300.60861: variable 'ansible_search_path' from source: unknown 30582 1726855300.60863: variable 'ansible_search_path' from source: unknown 30582 1726855300.60872: calling self._execute() 30582 1726855300.61077: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.61081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.61084: variable 'omit' from source: magic vars 30582 1726855300.61751: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.61780: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.62128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855300.62807: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855300.62856: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855300.62891: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855300.62929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855300.63219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855300.63246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855300.63271: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855300.63298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855300.63386: variable '__network_is_ostree' from source: set_fact 30582 1726855300.63600: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855300.63603: when evaluation is False, skipping this task 30582 1726855300.63606: _execute() done 30582 1726855300.63609: dumping result to json 30582 1726855300.63611: done dumping result, returning 30582 1726855300.63622: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000000d6f] 30582 1726855300.63627: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d6f skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855300.63897: no more pending results, returning what we have 30582 1726855300.63902: results queue empty 30582 1726855300.63904: checking for any_errors_fatal 30582 1726855300.63912: done checking for any_errors_fatal 30582 1726855300.63913: checking for max_fail_percentage 30582 1726855300.63915: done checking for max_fail_percentage 30582 1726855300.63916: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.63917: done checking to see if all hosts have failed 30582 1726855300.63917: getting the remaining hosts for this loop 30582 1726855300.63919: done getting the remaining hosts for this loop 30582 1726855300.63923: getting the next task for host managed_node3 30582 1726855300.63933: done getting next task for host managed_node3 30582 
1726855300.63938: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855300.63943: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855300.64202: getting variables 30582 1726855300.64204: in VariableManager get_vars() 30582 1726855300.64241: Calling all_inventory to load vars for managed_node3 30582 1726855300.64244: Calling groups_inventory to load vars for managed_node3 30582 1726855300.64246: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.64255: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.64258: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.64260: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.64804: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d6f 30582 1726855300.64808: WORKER PROCESS EXITING 30582 1726855300.67118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.69281: done with get_vars() 30582 1726855300.69320: done getting variables 30582 1726855300.69380: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:01:40 -0400 (0:00:00.103) 0:00:37.044 ****** 30582 1726855300.69427: entering _queue_task() for managed_node3/set_fact 30582 1726855300.69815: worker is 1 (out of 1 available) 30582 1726855300.69830: exiting _queue_task() for managed_node3/set_fact 30582 1726855300.69842: done queuing things up, now waiting for results queue to drain 30582 1726855300.69843: waiting for pending results... 
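Both `set_facts.yml` tasks here skip for the same reason: the guard `not __network_is_ostree is defined` evaluates to False because the fact was set earlier in the play. The guard's logic can be re-expressed as (a hedged sketch, not the role's actual code):

```python
def should_detect_ostree(facts: dict) -> bool:
    # Mirrors the false_condition in the skip result above: run the
    # detection tasks only when __network_is_ostree was not already
    # set by an earlier invocation of the role.
    return "__network_is_ostree" not in facts

should_detect_ostree({})                             # first run: detect
should_detect_ostree({"__network_is_ostree": False})  # later runs: skip
```

This is the standard pattern for making fact-gathering idempotent across repeated role includes.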
30582 1726855300.70405: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855300.70434: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d70 30582 1726855300.70449: variable 'ansible_search_path' from source: unknown 30582 1726855300.70452: variable 'ansible_search_path' from source: unknown 30582 1726855300.70490: calling self._execute() 30582 1726855300.70583: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.70590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.70600: variable 'omit' from source: magic vars 30582 1726855300.71012: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.71023: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.71394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855300.71521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855300.71563: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855300.71608: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855300.71643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855300.71746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855300.71770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855300.71797: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855300.71832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855300.71917: variable '__network_is_ostree' from source: set_fact 30582 1726855300.71936: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855300.71939: when evaluation is False, skipping this task 30582 1726855300.71941: _execute() done 30582 1726855300.71944: dumping result to json 30582 1726855300.71948: done dumping result, returning 30582 1726855300.71958: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000000d70] 30582 1726855300.71963: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d70 30582 1726855300.72056: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d70 30582 1726855300.72060: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855300.72111: no more pending results, returning what we have 30582 1726855300.72116: results queue empty 30582 1726855300.72118: checking for any_errors_fatal 30582 1726855300.72125: done checking for any_errors_fatal 30582 1726855300.72126: checking for max_fail_percentage 30582 1726855300.72128: done checking for max_fail_percentage 30582 1726855300.72129: checking to see if all hosts have failed and the running result is not ok 30582 1726855300.72130: done checking to see if all hosts have failed 30582 1726855300.72133: getting the remaining hosts for this loop 30582 1726855300.72134: done getting the remaining hosts for this loop 
30582 1726855300.72138: getting the next task for host managed_node3 30582 1726855300.72152: done getting next task for host managed_node3 30582 1726855300.72155: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855300.72162: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855300.72184: getting variables 30582 1726855300.72186: in VariableManager get_vars() 30582 1726855300.72232: Calling all_inventory to load vars for managed_node3 30582 1726855300.72235: Calling groups_inventory to load vars for managed_node3 30582 1726855300.72238: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855300.72250: Calling all_plugins_play to load vars for managed_node3 30582 1726855300.72254: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855300.72257: Calling groups_plugins_play to load vars for managed_node3 30582 1726855300.74270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855300.75905: done with get_vars() 30582 1726855300.75932: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:01:40 -0400 (0:00:00.066) 0:00:37.110 ****** 30582 1726855300.76048: entering _queue_task() for managed_node3/service_facts 30582 1726855300.76556: worker is 1 (out of 1 available) 30582 1726855300.76567: exiting _queue_task() for managed_node3/service_facts 30582 1726855300.76580: done queuing things up, now waiting for results queue to drain 30582 1726855300.76582: waiting for pending results... 
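The `service_facts` module queued above returns, under `ansible_facts.services`, a mapping from unit name to a dict with `name`, `state`, `status`, and `source` keys (visible in the module's stdout further down). A small consumer of that shape (the sample entries are copied from this log; the helper itself is illustrative):

```python
def running_services(services: dict) -> list[str]:
    # service_facts maps unit name -> {"name", "state", "status",
    # "source"}; keep only the units systemd reports as running.
    return sorted(name for name, svc in services.items()
                  if svc["state"] == "running")

sample = {
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
    "autofs.service": {"name": "autofs.service", "state": "stopped",
                       "status": "not-found", "source": "systemd"},
}
# running_services(sample) -> ["auditd.service"]
```

The network role uses this inventory of running services to decide, for example, whether NetworkManager or the legacy network service should be managed.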
30582 1726855300.76911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855300.76981: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d72 30582 1726855300.77094: variable 'ansible_search_path' from source: unknown 30582 1726855300.77098: variable 'ansible_search_path' from source: unknown 30582 1726855300.77101: calling self._execute() 30582 1726855300.77151: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.77163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.77179: variable 'omit' from source: magic vars 30582 1726855300.77620: variable 'ansible_distribution_major_version' from source: facts 30582 1726855300.77623: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855300.77626: variable 'omit' from source: magic vars 30582 1726855300.77670: variable 'omit' from source: magic vars 30582 1726855300.77711: variable 'omit' from source: magic vars 30582 1726855300.77764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855300.77807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855300.77835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855300.77945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855300.77948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855300.77952: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855300.77954: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.77956: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855300.78038: Set connection var ansible_timeout to 10 30582 1726855300.78050: Set connection var ansible_connection to ssh 30582 1726855300.78067: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855300.78081: Set connection var ansible_pipelining to False 30582 1726855300.78094: Set connection var ansible_shell_executable to /bin/sh 30582 1726855300.78102: Set connection var ansible_shell_type to sh 30582 1726855300.78129: variable 'ansible_shell_executable' from source: unknown 30582 1726855300.78138: variable 'ansible_connection' from source: unknown 30582 1726855300.78162: variable 'ansible_module_compression' from source: unknown 30582 1726855300.78165: variable 'ansible_shell_type' from source: unknown 30582 1726855300.78167: variable 'ansible_shell_executable' from source: unknown 30582 1726855300.78169: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855300.78171: variable 'ansible_pipelining' from source: unknown 30582 1726855300.78271: variable 'ansible_timeout' from source: unknown 30582 1726855300.78274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855300.78405: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855300.78423: variable 'omit' from source: magic vars 30582 1726855300.78433: starting attempt loop 30582 1726855300.78440: running the handler 30582 1726855300.78459: _low_level_execute_command(): starting 30582 1726855300.78473: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855300.79520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855300.79537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855300.79712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855300.79815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855300.80014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855300.81713: stdout chunk (state=3): >>>/root <<< 30582 1726855300.81869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855300.81873: stdout chunk (state=3): >>><<< 30582 1726855300.81875: stderr chunk (state=3): >>><<< 30582 1726855300.82115: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855300.82119: _low_level_execute_command(): starting 30582 1726855300.82122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743 `" && echo ansible-tmp-1726855300.8191123-32316-227501663219743="` echo /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743 `" ) && sleep 0' 30582 1726855300.83094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855300.83098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855300.83100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855300.83115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855300.83212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855300.83225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855300.83385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855300.83470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855300.85510: stdout chunk (state=3): >>>ansible-tmp-1726855300.8191123-32316-227501663219743=/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743 <<< 30582 1726855300.85528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855300.85569: stderr chunk (state=3): >>><<< 30582 1726855300.85578: stdout chunk (state=3): >>><<< 30582 1726855300.85611: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855300.8191123-32316-227501663219743=/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855300.85998: variable 'ansible_module_compression' from source: unknown 30582 1726855300.86002: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855300.86004: variable 'ansible_facts' from source: unknown 30582 1726855300.86132: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py 30582 1726855300.86371: Sending initial data 30582 1726855300.86400: Sent initial data (162 bytes) 30582 1726855300.87708: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855300.87767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855300.87864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855300.87879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855300.88014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855300.89822: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855300.89870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855300.90025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpcuphtfgc /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py <<< 30582 1726855300.90029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py" <<< 30582 1726855300.90075: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpcuphtfgc" to remote "/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py" <<< 30582 1726855300.91656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855300.91674: stdout chunk (state=3): >>><<< 30582 1726855300.91703: stderr chunk (state=3): >>><<< 30582 1726855300.91962: done transferring module to remote 30582 1726855300.91965: _low_level_execute_command(): starting 30582 1726855300.91967: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/ /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py && sleep 0' 30582 1726855300.93092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855300.93278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855300.93411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855300.93509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855300.95377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855300.95491: stderr chunk (state=3): >>><<< 30582 1726855300.95495: stdout chunk (state=3): >>><<< 30582 1726855300.95578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855300.95582: _low_level_execute_command(): starting 30582 1726855300.95584: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/AnsiballZ_service_facts.py && sleep 0' 30582 1726855300.97019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855300.97121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855300.97135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855300.97324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.48804: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": 
"audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855302.48918: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855302.50568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
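The JSON payload returned above by the `service_facts` module is a dict under `ansible_facts.services`, keyed by unit name, where each entry carries `name`, `state`, `status`, and `source`. As a minimal sketch (the sample data below is a hand-picked excerpt of the log's output, not the full result), this is how that structure can be filtered for running services in plain Python:

```python
# Sketch: filter the service_facts result shape for running services.
# The sample JSON mirrors a few entries from the log's ansible_facts.services.
import json

sample = json.loads("""
{"ansible_facts": {"services": {
  "auditd.service": {"name": "auditd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
  "autofs.service": {"name": "autofs.service", "state": "stopped",
                     "status": "not-found", "source": "systemd"},
  "sshd.service":   {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"}
}}}
""")

services = sample["ansible_facts"]["services"]
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['auditd.service', 'sshd.service']
```

In a playbook the same filter is typically written as a Jinja2 expression over `ansible_facts.services` after a `service_facts` task; the Python above just makes the data shape explicit.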
<<< 30582 1726855302.50576: stdout chunk (state=3): >>><<< 30582 1726855302.50579: stderr chunk (state=3): >>><<< 30582 1726855302.50584: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
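The JSON blob that just ended is the `service_facts` module's return payload: one entry per systemd unit, each carrying `name`, `state`, `status`, and `source`. A minimal sketch of consuming such a dict follows — the sample data is a tiny excerpt modeled on the log above, and the helper name is ours, not part of Ansible:

```python
# Tiny excerpt of a service_facts-style dict, modeled on the log payload above.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
}

def services_in_state(facts, state):
    """Return sorted names of services whose 'state' field matches.

    Hypothetical helper for illustration; not an Ansible API.
    """
    return sorted(name for name, svc in facts.items() if svc["state"] == state)

running = services_in_state(services, "running")
# In a playbook the same dict is available as ansible_facts.services
# after a service_facts task (here it is censored by no_log).
```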
30582 1726855302.52409: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855302.52413: _low_level_execute_command(): starting 30582 1726855302.52415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855300.8191123-32316-227501663219743/ > /dev/null 2>&1 && sleep 0' 30582 1726855302.53620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855302.53742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.53783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855302.53798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855302.53828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.54077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.55850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855302.55938: stderr chunk (state=3): >>><<< 30582 1726855302.55961: stdout chunk (state=3): >>><<< 30582 1726855302.55984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855302.56015: handler run complete 30582 
1726855302.56547: variable 'ansible_facts' from source: unknown 30582 1726855302.56847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855302.58049: variable 'ansible_facts' from source: unknown 30582 1726855302.58397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855302.58748: attempt loop complete, returning result 30582 1726855302.58806: _execute() done 30582 1726855302.58815: dumping result to json 30582 1726855302.58962: done dumping result, returning 30582 1726855302.58981: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000000d72] 30582 1726855302.58996: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d72 30582 1726855302.61824: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d72 30582 1726855302.61829: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855302.62002: no more pending results, returning what we have 30582 1726855302.62006: results queue empty 30582 1726855302.62007: checking for any_errors_fatal 30582 1726855302.62011: done checking for any_errors_fatal 30582 1726855302.62012: checking for max_fail_percentage 30582 1726855302.62014: done checking for max_fail_percentage 30582 1726855302.62014: checking to see if all hosts have failed and the running result is not ok 30582 1726855302.62015: done checking to see if all hosts have failed 30582 1726855302.62016: getting the remaining hosts for this loop 30582 1726855302.62017: done getting the remaining hosts for this loop 30582 1726855302.62020: getting the next task for host managed_node3 30582 1726855302.62027: done getting next task for host managed_node3 30582 1726855302.62031: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855302.62036: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855302.62050: getting variables 30582 1726855302.62051: in VariableManager get_vars() 30582 1726855302.62082: Calling all_inventory to load vars for managed_node3 30582 1726855302.62085: Calling groups_inventory to load vars for managed_node3 30582 1726855302.62317: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855302.62328: Calling all_plugins_play to load vars for managed_node3 30582 1726855302.62332: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855302.62335: Calling groups_plugins_play to load vars for managed_node3 30582 1726855302.65279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855302.69171: done with get_vars() 30582 1726855302.69208: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:01:42 -0400 (0:00:01.934) 0:00:39.045 ****** 30582 1726855302.69654: entering _queue_task() for managed_node3/package_facts 30582 1726855302.70276: worker is 1 (out of 1 available) 30582 1726855302.70292: exiting _queue_task() for managed_node3/package_facts 30582 1726855302.70335: done queuing things up, now waiting for results queue to drain 30582 1726855302.70337: waiting for pending results... 
30582 1726855302.71432: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855302.71438: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d73 30582 1726855302.71451: variable 'ansible_search_path' from source: unknown 30582 1726855302.71459: variable 'ansible_search_path' from source: unknown 30582 1726855302.71536: calling self._execute() 30582 1726855302.71796: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855302.71815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855302.71832: variable 'omit' from source: magic vars 30582 1726855302.72347: variable 'ansible_distribution_major_version' from source: facts 30582 1726855302.72364: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855302.72376: variable 'omit' from source: magic vars 30582 1726855302.72474: variable 'omit' from source: magic vars 30582 1726855302.72527: variable 'omit' from source: magic vars 30582 1726855302.72575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855302.72720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855302.72725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855302.72734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855302.72738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855302.72750: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855302.72760: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855302.72768: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855302.72900: Set connection var ansible_timeout to 10 30582 1726855302.72909: Set connection var ansible_connection to ssh 30582 1726855302.72922: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855302.72933: Set connection var ansible_pipelining to False 30582 1726855302.72953: Set connection var ansible_shell_executable to /bin/sh 30582 1726855302.72961: Set connection var ansible_shell_type to sh 30582 1726855302.72989: variable 'ansible_shell_executable' from source: unknown 30582 1726855302.72999: variable 'ansible_connection' from source: unknown 30582 1726855302.73007: variable 'ansible_module_compression' from source: unknown 30582 1726855302.73049: variable 'ansible_shell_type' from source: unknown 30582 1726855302.73052: variable 'ansible_shell_executable' from source: unknown 30582 1726855302.73062: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855302.73064: variable 'ansible_pipelining' from source: unknown 30582 1726855302.73066: variable 'ansible_timeout' from source: unknown 30582 1726855302.73069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855302.73355: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855302.73394: variable 'omit' from source: magic vars 30582 1726855302.73440: starting attempt loop 30582 1726855302.73443: running the handler 30582 1726855302.73446: _low_level_execute_command(): starting 30582 1726855302.73459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855302.74785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855302.74815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.74922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.76633: stdout chunk (state=3): >>>/root <<< 30582 1726855302.76773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855302.76812: stderr chunk (state=3): >>><<< 30582 1726855302.76830: stdout chunk (state=3): >>><<< 30582 1726855302.76856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855302.76894: _low_level_execute_command(): starting 30582 1726855302.76977: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800 `" && echo ansible-tmp-1726855302.7687125-32394-37624556190800="` echo /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800 `" ) && sleep 0' 30582 1726855302.77827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855302.77831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855302.77834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.77872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.77948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855302.78197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.78456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.80511: stdout chunk (state=3): >>>ansible-tmp-1726855302.7687125-32394-37624556190800=/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800 <<< 30582 1726855302.80515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855302.80594: stdout chunk (state=3): >>><<< 30582 1726855302.80598: stderr chunk (state=3): >>><<< 30582 1726855302.80601: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855302.7687125-32394-37624556190800=/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855302.80604: variable 'ansible_module_compression' from source: unknown 30582 1726855302.80646: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855302.80951: variable 'ansible_facts' from source: unknown 30582 1726855302.81309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py 30582 1726855302.81433: Sending initial data 30582 1726855302.81477: Sent initial data (161 bytes) 30582 1726855302.82547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855302.82565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855302.82655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.82792: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855302.82816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.82909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.85032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855302.85613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpm485qcdt /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py <<< 30582 1726855302.85618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpm485qcdt" to remote "/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py" <<< 30582 1726855302.88772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855302.88776: stdout chunk (state=3): >>><<< 30582 1726855302.88778: stderr chunk (state=3): >>><<< 30582 1726855302.88790: done transferring module to remote 30582 1726855302.88805: _low_level_execute_command(): starting 30582 1726855302.88813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/ /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py && sleep 0' 30582 1726855302.90296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855302.90301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.90307: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855302.90319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855302.90578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.90630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855302.92506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855302.92510: stdout chunk (state=3): >>><<< 30582 1726855302.92571: stderr chunk (state=3): >>><<< 30582 1726855302.92612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855302.92622: _low_level_execute_command(): starting 30582 1726855302.92625: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/AnsiballZ_package_facts.py && sleep 0' 30582 1726855302.93856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855302.94111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855302.94211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855302.94257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855302.94438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855303.38463: 
stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": 
"xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30582 1726855303.38477: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": 
"centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30582 1726855303.38525: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": 
"libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": 
[{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": 
"1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855303.38627: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": 
"3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30582 1726855303.38654: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": 
"1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855303.38663: stdout chunk (state=3): >>>", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855303.38666: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": 
"checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30582 1726855303.38686: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855303.40683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855303.40686: stdout chunk (state=3): >>><<< 30582 1726855303.40691: stderr chunk (state=3): >>><<< 30582 1726855303.40756: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855303.43238: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855303.43251: _low_level_execute_command(): starting 30582 1726855303.43254: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855302.7687125-32394-37624556190800/ > /dev/null 2>&1 && sleep 0' 30582 1726855303.43807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855303.43836: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855303.43840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855303.43845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855303.43914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855303.45836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855303.45926: stderr chunk (state=3): >>><<< 30582 1726855303.45930: stdout chunk (state=3): >>><<< 30582 1726855303.45960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855303.46192: handler run complete 30582 1726855303.47489: variable 'ansible_facts' from source: unknown 30582 1726855303.48748: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.51144: variable 'ansible_facts' from source: unknown 30582 1726855303.51401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.52143: attempt loop complete, returning result 30582 1726855303.52161: _execute() done 30582 1726855303.52164: dumping result to json 30582 1726855303.52379: done dumping result, returning 30582 1726855303.52391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000000d73] 30582 1726855303.52403: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d73 30582 1726855303.55214: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d73 30582 1726855303.55219: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855303.55355: no more pending results, returning what we have 30582 1726855303.55358: results queue empty 30582 1726855303.55359: checking for any_errors_fatal 30582 1726855303.55364: done checking for any_errors_fatal 30582 1726855303.55364: checking for max_fail_percentage 30582 1726855303.55366: done checking for max_fail_percentage 30582 1726855303.55367: checking to see if all hosts have failed and the running result is not ok 30582 1726855303.55368: done checking to see if all hosts have failed 30582 1726855303.55369: getting the remaining hosts for this loop 30582 1726855303.55370: done getting the remaining hosts for this loop 30582 1726855303.55376: getting the next task for host managed_node3 30582 1726855303.55384: done getting next task for host managed_node3 30582 1726855303.55391: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855303.55396: 
^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855303.55408: getting variables 30582 1726855303.55409: in VariableManager get_vars() 30582 1726855303.55438: Calling all_inventory to load vars for managed_node3 30582 1726855303.55441: Calling groups_inventory to load vars for managed_node3 30582 1726855303.55443: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855303.55452: Calling all_plugins_play to load vars for managed_node3 30582 1726855303.55455: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855303.55458: Calling groups_plugins_play to load vars for managed_node3 30582 1726855303.56971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.58712: done with get_vars() 30582 1726855303.58737: done getting variables 30582 1726855303.58800: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:01:43 -0400 (0:00:00.892) 0:00:39.938 ****** 30582 1726855303.58842: entering _queue_task() for managed_node3/debug 30582 1726855303.59197: worker is 1 (out of 1 available) 30582 1726855303.59211: exiting _queue_task() for managed_node3/debug 30582 1726855303.59224: done queuing things up, now waiting for results queue to drain 30582 1726855303.59226: waiting for pending results... 
30582 1726855303.59607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855303.59731: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d17 30582 1726855303.59748: variable 'ansible_search_path' from source: unknown 30582 1726855303.59753: variable 'ansible_search_path' from source: unknown 30582 1726855303.59791: calling self._execute() 30582 1726855303.59899: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855303.59903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.59966: variable 'omit' from source: magic vars 30582 1726855303.60329: variable 'ansible_distribution_major_version' from source: facts 30582 1726855303.60342: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855303.60345: variable 'omit' from source: magic vars 30582 1726855303.60463: variable 'omit' from source: magic vars 30582 1726855303.60813: variable 'network_provider' from source: set_fact 30582 1726855303.60816: variable 'omit' from source: magic vars 30582 1726855303.60818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855303.60821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855303.60824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855303.60826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855303.60828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855303.60995: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855303.60999: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855303.61003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.61137: Set connection var ansible_timeout to 10 30582 1726855303.61141: Set connection var ansible_connection to ssh 30582 1726855303.61147: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855303.61152: Set connection var ansible_pipelining to False 30582 1726855303.61157: Set connection var ansible_shell_executable to /bin/sh 30582 1726855303.61160: Set connection var ansible_shell_type to sh 30582 1726855303.61185: variable 'ansible_shell_executable' from source: unknown 30582 1726855303.61190: variable 'ansible_connection' from source: unknown 30582 1726855303.61193: variable 'ansible_module_compression' from source: unknown 30582 1726855303.61195: variable 'ansible_shell_type' from source: unknown 30582 1726855303.61198: variable 'ansible_shell_executable' from source: unknown 30582 1726855303.61200: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855303.61202: variable 'ansible_pipelining' from source: unknown 30582 1726855303.61206: variable 'ansible_timeout' from source: unknown 30582 1726855303.61209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.61410: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855303.61422: variable 'omit' from source: magic vars 30582 1726855303.61427: starting attempt loop 30582 1726855303.61431: running the handler 30582 1726855303.61485: handler run complete 30582 1726855303.61501: attempt loop complete, returning result 30582 1726855303.61504: _execute() done 30582 1726855303.61506: dumping result to json 30582 1726855303.61509: done dumping result, returning 
30582 1726855303.61578: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000000d17] 30582 1726855303.61581: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d17 30582 1726855303.61640: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d17 30582 1726855303.61643: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855303.61742: no more pending results, returning what we have 30582 1726855303.61745: results queue empty 30582 1726855303.61746: checking for any_errors_fatal 30582 1726855303.61753: done checking for any_errors_fatal 30582 1726855303.61754: checking for max_fail_percentage 30582 1726855303.61755: done checking for max_fail_percentage 30582 1726855303.61756: checking to see if all hosts have failed and the running result is not ok 30582 1726855303.61757: done checking to see if all hosts have failed 30582 1726855303.61758: getting the remaining hosts for this loop 30582 1726855303.61759: done getting the remaining hosts for this loop 30582 1726855303.61762: getting the next task for host managed_node3 30582 1726855303.61770: done getting next task for host managed_node3 30582 1726855303.61773: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855303.61777: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855303.61790: getting variables 30582 1726855303.61792: in VariableManager get_vars() 30582 1726855303.61820: Calling all_inventory to load vars for managed_node3 30582 1726855303.61823: Calling groups_inventory to load vars for managed_node3 30582 1726855303.61824: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855303.61832: Calling all_plugins_play to load vars for managed_node3 30582 1726855303.61835: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855303.61837: Calling groups_plugins_play to load vars for managed_node3 30582 1726855303.63469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.65868: done with get_vars() 30582 1726855303.65900: done getting variables 30582 1726855303.65985: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:01:43 -0400 (0:00:00.072) 0:00:40.010 ****** 30582 1726855303.66046: entering _queue_task() for managed_node3/fail 30582 1726855303.66621: worker is 1 (out of 1 available) 30582 1726855303.66635: exiting _queue_task() for managed_node3/fail 30582 1726855303.66650: done queuing things up, now waiting for results queue to drain 30582 1726855303.66651: waiting for pending results... 30582 1726855303.67111: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855303.67118: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d18 30582 1726855303.67306: variable 'ansible_search_path' from source: unknown 30582 1726855303.67310: variable 'ansible_search_path' from source: unknown 30582 1726855303.67315: calling self._execute() 30582 1726855303.67318: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855303.67320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.67323: variable 'omit' from source: magic vars 30582 1726855303.67847: variable 'ansible_distribution_major_version' from source: facts 30582 1726855303.67890: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855303.68021: variable 'network_state' from source: role '' defaults 30582 1726855303.68033: Evaluated conditional (network_state != {}): False 30582 1726855303.68036: when evaluation is False, skipping this task 30582 1726855303.68039: _execute() done 30582 1726855303.68042: dumping result to json 30582 1726855303.68044: done dumping result, returning 30582 1726855303.68053: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000000d18] 30582 1726855303.68058: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d18 30582 1726855303.68190: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d18 30582 1726855303.68194: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855303.68268: no more pending results, returning what we have 30582 1726855303.68272: results queue empty 30582 1726855303.68273: checking for any_errors_fatal 30582 1726855303.68280: done checking for any_errors_fatal 30582 1726855303.68281: checking for max_fail_percentage 30582 1726855303.68283: done checking for max_fail_percentage 30582 1726855303.68284: checking to see if all hosts have failed and the running result is not ok 30582 1726855303.68285: done checking to see if all hosts have failed 30582 1726855303.68285: getting the remaining hosts for this loop 30582 1726855303.68291: done getting the remaining hosts for this loop 30582 1726855303.68301: getting the next task for host managed_node3 30582 1726855303.68315: done getting next task for host managed_node3 30582 1726855303.68319: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855303.68325: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855303.68348: getting variables 30582 1726855303.68350: in VariableManager get_vars() 30582 1726855303.68532: Calling all_inventory to load vars for managed_node3 30582 1726855303.68535: Calling groups_inventory to load vars for managed_node3 30582 1726855303.68538: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855303.68571: Calling all_plugins_play to load vars for managed_node3 30582 1726855303.68577: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855303.68581: Calling groups_plugins_play to load vars for managed_node3 30582 1726855303.79058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.82181: done with get_vars() 30582 1726855303.82214: done getting variables 30582 1726855303.82260: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:01:43 -0400 (0:00:00.162) 0:00:40.172 ****** 30582 1726855303.82301: entering _queue_task() for managed_node3/fail 30582 1726855303.82792: worker is 1 (out of 1 available) 30582 1726855303.82806: exiting _queue_task() for managed_node3/fail 30582 1726855303.82819: done queuing things up, now waiting for results queue to drain 30582 1726855303.82955: waiting for pending results... 30582 1726855303.83416: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855303.83483: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d19 30582 1726855303.83491: variable 'ansible_search_path' from source: unknown 30582 1726855303.83495: variable 'ansible_search_path' from source: unknown 30582 1726855303.83504: calling self._execute() 30582 1726855303.83649: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855303.83654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.83679: variable 'omit' from source: magic vars 30582 1726855303.84258: variable 'ansible_distribution_major_version' from source: facts 30582 1726855303.84262: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855303.84364: variable 'network_state' from source: role '' defaults 30582 1726855303.84505: Evaluated conditional (network_state != {}): False 30582 1726855303.84509: when evaluation is False, skipping this task 30582 1726855303.84513: _execute() done 30582 1726855303.84518: dumping result to json 30582 1726855303.84521: done dumping result, returning 30582 1726855303.84524: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000000d19] 30582 1726855303.84528: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d19 30582 1726855303.84612: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d19 30582 1726855303.84616: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855303.84681: no more pending results, returning what we have 30582 1726855303.84806: results queue empty 30582 1726855303.84808: checking for any_errors_fatal 30582 1726855303.84815: done checking for any_errors_fatal 30582 1726855303.84816: checking for max_fail_percentage 30582 1726855303.84818: done checking for max_fail_percentage 30582 1726855303.84819: checking to see if all hosts have failed and the running result is not ok 30582 1726855303.84820: done checking to see if all hosts have failed 30582 1726855303.84821: getting the remaining hosts for this loop 30582 1726855303.84822: done getting the remaining hosts for this loop 30582 1726855303.84826: getting the next task for host managed_node3 30582 1726855303.84835: done getting next task for host managed_node3 30582 1726855303.84839: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855303.84845: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855303.84868: getting variables 30582 1726855303.84870: in VariableManager get_vars() 30582 1726855303.85161: Calling all_inventory to load vars for managed_node3 30582 1726855303.85164: Calling groups_inventory to load vars for managed_node3 30582 1726855303.85166: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855303.85179: Calling all_plugins_play to load vars for managed_node3 30582 1726855303.85182: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855303.85185: Calling groups_plugins_play to load vars for managed_node3 30582 1726855303.86986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855303.88980: done with get_vars() 30582 1726855303.89008: done getting variables 30582 1726855303.89079: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:01:43 -0400 (0:00:00.068) 0:00:40.241 ****** 30582 1726855303.89118: entering _queue_task() for managed_node3/fail 30582 1726855303.89510: worker is 1 (out of 1 available) 30582 1726855303.89525: exiting _queue_task() for managed_node3/fail 30582 1726855303.89537: done queuing things up, now waiting for results queue to drain 30582 1726855303.89539: waiting for pending results... 30582 1726855303.89925: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855303.90123: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1a 30582 1726855303.90128: variable 'ansible_search_path' from source: unknown 30582 1726855303.90131: variable 'ansible_search_path' from source: unknown 30582 1726855303.90134: calling self._execute() 30582 1726855303.90267: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855303.90314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855303.90337: variable 'omit' from source: magic vars 30582 1726855303.90879: variable 'ansible_distribution_major_version' from source: facts 30582 1726855303.90918: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855303.91076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855303.93425: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855303.93532: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855303.93557: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855303.93597: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855303.93640: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855303.93793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855303.93797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855303.93801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855303.93832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855303.93852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855303.93965: variable 'ansible_distribution_major_version' from source: facts 30582 1726855303.94029: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855303.94121: variable 'ansible_distribution' from source: facts 30582 1726855303.94136: variable '__network_rh_distros' from source: role '' defaults 30582 1726855303.94153: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855303.94420: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855303.94448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855303.94484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855303.94574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855303.94577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855303.94605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855303.94633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855303.94713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855303.94753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855303.94774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855303.94828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855303.94897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855303.94900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855303.94927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855303.94947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855303.95297: variable 'network_connections' from source: include params 30582 1726855303.95342: variable 'interface' from source: play vars 30582 1726855303.95470: variable 'interface' from source: play vars 30582 1726855303.95511: variable 'network_state' from source: role '' defaults 30582 1726855303.95655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855303.95822: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855303.95866: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855303.95909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855303.95940: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855303.95995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855303.96023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855303.96096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855303.96099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855303.96293: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855303.96297: when evaluation is False, skipping this task 30582 1726855303.96299: _execute() done 30582 1726855303.96301: dumping result to json 30582 1726855303.96304: done dumping result, returning 30582 1726855303.96306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000000d1a] 30582 1726855303.96308: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000d1a 30582 1726855303.96379: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1a 30582 1726855303.96383: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855303.96434: no more pending results, returning what we have 30582 1726855303.96439: results queue empty 30582 1726855303.96440: checking for any_errors_fatal 30582 1726855303.96445: done checking for any_errors_fatal 30582 1726855303.96446: checking for max_fail_percentage 30582 1726855303.96448: done checking for max_fail_percentage 30582 1726855303.96449: checking to see if all hosts have failed and the running result is not ok 30582 1726855303.96450: done checking to see if all hosts have failed 30582 1726855303.96451: getting the remaining hosts for this loop 30582 1726855303.96452: done getting the remaining hosts for this loop 30582 1726855303.96456: getting the next task for host managed_node3 30582 1726855303.96466: done getting next task for host managed_node3 30582 1726855303.96470: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855303.96475: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855303.96500: getting variables 30582 1726855303.96502: in VariableManager get_vars() 30582 1726855303.96541: Calling all_inventory to load vars for managed_node3 30582 1726855303.96544: Calling groups_inventory to load vars for managed_node3 30582 1726855303.96546: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855303.96559: Calling all_plugins_play to load vars for managed_node3 30582 1726855303.96562: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855303.96565: Calling groups_plugins_play to load vars for managed_node3 30582 1726855303.98965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.00668: done with get_vars() 30582 1726855304.00704: done getting variables 30582 1726855304.00771: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:01:44 -0400 (0:00:00.116) 0:00:40.358 ****** 30582 1726855304.00814: entering _queue_task() for managed_node3/dnf 30582 1726855304.01329: worker is 1 (out of 1 available) 30582 1726855304.01343: exiting _queue_task() for managed_node3/dnf 30582 1726855304.01355: done queuing things up, now waiting for results queue to drain 30582 1726855304.01357: waiting for pending results... 30582 1726855304.01718: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855304.01925: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1b 30582 1726855304.01930: variable 'ansible_search_path' from source: unknown 30582 1726855304.01933: variable 'ansible_search_path' from source: unknown 30582 1726855304.01936: calling self._execute() 30582 1726855304.02027: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.02043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.02058: variable 'omit' from source: magic vars 30582 1726855304.02577: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.02581: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.02716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.05095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.05551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.05601: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.05646: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.05680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.05778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.05815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.05993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.05996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.05999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.06039: variable 'ansible_distribution' from source: facts 30582 1726855304.06049: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.06069: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855304.06199: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.06346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.06376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.06409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.06459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.06482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.06530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.06566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.06599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.06639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.06665: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.06716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.06744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.06793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.06881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.06885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.07015: variable 'network_connections' from source: include params 30582 1726855304.07032: variable 'interface' from source: play vars 30582 1726855304.07104: variable 'interface' from source: play vars 30582 1726855304.07182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855304.07371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855304.07426: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855304.07463: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855304.07531: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855304.07567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855304.07598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855304.07640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.07750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855304.07753: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855304.08001: variable 'network_connections' from source: include params 30582 1726855304.08012: variable 'interface' from source: play vars 30582 1726855304.08086: variable 'interface' from source: play vars 30582 1726855304.08117: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855304.08124: when evaluation is False, skipping this task 30582 1726855304.08130: _execute() done 30582 1726855304.08136: dumping result to json 30582 1726855304.08141: done dumping result, returning 30582 1726855304.08152: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000d1b] 30582 
1726855304.08161: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1b skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855304.08541: no more pending results, returning what we have 30582 1726855304.08544: results queue empty 30582 1726855304.08545: checking for any_errors_fatal 30582 1726855304.08552: done checking for any_errors_fatal 30582 1726855304.08553: checking for max_fail_percentage 30582 1726855304.08554: done checking for max_fail_percentage 30582 1726855304.08555: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.08556: done checking to see if all hosts have failed 30582 1726855304.08556: getting the remaining hosts for this loop 30582 1726855304.08558: done getting the remaining hosts for this loop 30582 1726855304.08562: getting the next task for host managed_node3 30582 1726855304.08571: done getting next task for host managed_node3 30582 1726855304.08577: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855304.08582: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.08608: getting variables 30582 1726855304.08610: in VariableManager get_vars() 30582 1726855304.08648: Calling all_inventory to load vars for managed_node3 30582 1726855304.08651: Calling groups_inventory to load vars for managed_node3 30582 1726855304.08653: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.08665: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.08668: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.08671: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.09238: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1b 30582 1726855304.09242: WORKER PROCESS EXITING 30582 1726855304.10460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.12089: done with get_vars() 30582 1726855304.12113: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855304.12199: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:01:44 -0400 (0:00:00.114) 0:00:40.472 ****** 30582 1726855304.12235: entering _queue_task() for managed_node3/yum 30582 1726855304.12619: worker is 1 (out of 1 available) 30582 1726855304.12799: exiting _queue_task() for managed_node3/yum 30582 1726855304.12811: done queuing things up, now waiting for results queue to drain 30582 1726855304.12812: waiting for pending results... 30582 1726855304.12956: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855304.13121: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1c 30582 1726855304.13144: variable 'ansible_search_path' from source: unknown 30582 1726855304.13157: variable 'ansible_search_path' from source: unknown 30582 1726855304.13201: calling self._execute() 30582 1726855304.13306: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.13318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.13333: variable 'omit' from source: magic vars 30582 1726855304.13736: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.13753: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.13946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.16270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.16363: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.16421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.16460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.16496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.16585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.16689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.16694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.16708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.16728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.16844: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.16866: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855304.16876: when evaluation is False, skipping this task 30582 1726855304.16917: _execute() done 30582 1726855304.16920: dumping result to json 30582 1726855304.16922: done dumping result, 
returning 30582 1726855304.16925: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000d1c] 30582 1726855304.16927: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1c skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855304.17218: no more pending results, returning what we have 30582 1726855304.17223: results queue empty 30582 1726855304.17224: checking for any_errors_fatal 30582 1726855304.17233: done checking for any_errors_fatal 30582 1726855304.17234: checking for max_fail_percentage 30582 1726855304.17236: done checking for max_fail_percentage 30582 1726855304.17237: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.17237: done checking to see if all hosts have failed 30582 1726855304.17238: getting the remaining hosts for this loop 30582 1726855304.17240: done getting the remaining hosts for this loop 30582 1726855304.17244: getting the next task for host managed_node3 30582 1726855304.17254: done getting next task for host managed_node3 30582 1726855304.17258: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855304.17263: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.17290: getting variables 30582 1726855304.17292: in VariableManager get_vars() 30582 1726855304.17333: Calling all_inventory to load vars for managed_node3 30582 1726855304.17336: Calling groups_inventory to load vars for managed_node3 30582 1726855304.17338: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.17350: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.17354: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.17358: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.17905: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1c 30582 1726855304.17909: WORKER PROCESS EXITING 30582 1726855304.19017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.20899: done with get_vars() 30582 1726855304.20950: done getting variables 30582 1726855304.21169: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:01:44 -0400 (0:00:00.089) 0:00:40.562 ****** 30582 1726855304.21259: entering _queue_task() for managed_node3/fail 30582 1726855304.22059: worker is 1 (out of 1 available) 30582 1726855304.22071: exiting _queue_task() for managed_node3/fail 30582 1726855304.22084: done queuing things up, now waiting for results queue to drain 30582 1726855304.22085: waiting for pending results... 30582 1726855304.22434: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855304.22692: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1d 30582 1726855304.22713: variable 'ansible_search_path' from source: unknown 30582 1726855304.22721: variable 'ansible_search_path' from source: unknown 30582 1726855304.22767: calling self._execute() 30582 1726855304.22879: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.22893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.22906: variable 'omit' from source: magic vars 30582 1726855304.23694: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.23698: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.23884: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.24328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.28074: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.28171: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.28248: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.28463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.28467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.28533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.28566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.28598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.28649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.28670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.28725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 
1726855304.28752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.28781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.28830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.28849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.28895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.28923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.28955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.29157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.29161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30582 1726855304.29567: variable 'network_connections' from source: include params 30582 1726855304.29608: variable 'interface' from source: play vars 30582 1726855304.29734: variable 'interface' from source: play vars 30582 1726855304.30028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855304.30394: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855304.30444: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855304.30483: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855304.30697: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855304.30733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855304.30807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855304.30838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.30925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855304.31069: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855304.31697: variable 'network_connections' from source: include params 30582 1726855304.31700: variable 'interface' from source: play 
vars 30582 1726855304.31892: variable 'interface' from source: play vars 30582 1726855304.31896: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855304.31899: when evaluation is False, skipping this task 30582 1726855304.31901: _execute() done 30582 1726855304.31903: dumping result to json 30582 1726855304.31907: done dumping result, returning 30582 1726855304.31909: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000d1d] 30582 1726855304.31911: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1d skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855304.32197: no more pending results, returning what we have 30582 1726855304.32202: results queue empty 30582 1726855304.32203: checking for any_errors_fatal 30582 1726855304.32211: done checking for any_errors_fatal 30582 1726855304.32212: checking for max_fail_percentage 30582 1726855304.32214: done checking for max_fail_percentage 30582 1726855304.32215: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.32216: done checking to see if all hosts have failed 30582 1726855304.32217: getting the remaining hosts for this loop 30582 1726855304.32218: done getting the remaining hosts for this loop 30582 1726855304.32222: getting the next task for host managed_node3 30582 1726855304.32231: done getting next task for host managed_node3 30582 1726855304.32234: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855304.32240: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.32379: getting variables 30582 1726855304.32382: in VariableManager get_vars() 30582 1726855304.32423: Calling all_inventory to load vars for managed_node3 30582 1726855304.32426: Calling groups_inventory to load vars for managed_node3 30582 1726855304.32429: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.32442: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.32445: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.32447: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.33248: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1d 30582 1726855304.33251: WORKER PROCESS EXITING 30582 1726855304.35945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.37797: done with get_vars() 30582 1726855304.37823: done getting variables 30582 1726855304.37891: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:01:44 -0400 (0:00:00.166) 0:00:40.729 ****** 30582 1726855304.37927: entering _queue_task() for managed_node3/package 30582 1726855304.38304: worker is 1 (out of 1 available) 30582 1726855304.38317: exiting _queue_task() for managed_node3/package 30582 1726855304.38332: done queuing things up, now waiting for results queue to drain 30582 1726855304.38333: waiting for pending results... 30582 1726855304.38638: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855304.38807: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1e 30582 1726855304.38817: variable 'ansible_search_path' from source: unknown 30582 1726855304.38821: variable 'ansible_search_path' from source: unknown 30582 1726855304.38858: calling self._execute() 30582 1726855304.38963: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.38966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.38980: variable 'omit' from source: magic vars 30582 1726855304.39431: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.39437: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.39611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855304.39936: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 
1726855304.39982: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855304.40309: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855304.40575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855304.40579: variable 'network_packages' from source: role '' defaults 30582 1726855304.40653: variable '__network_provider_setup' from source: role '' defaults 30582 1726855304.40664: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855304.40735: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855304.40744: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855304.40806: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855304.40998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.43151: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.43219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.43257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.43295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.43322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.43422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.43585: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.43591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.43594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.43596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.43690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.43796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.43820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.43976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.43980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855304.44266: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855304.44316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.44340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.44364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.44593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.44597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.44599: variable 'ansible_python' from source: facts 30582 1726855304.44602: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855304.44630: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855304.44719: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855304.44935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.44938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.44942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.44944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.44952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.45004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.45031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.45053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.45103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.45116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.45326: variable 'network_connections' from source: include params 
30582 1726855304.45338: variable 'interface' from source: play vars 30582 1726855304.45471: variable 'interface' from source: play vars 30582 1726855304.45560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855304.45597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855304.45628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.45656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855304.45710: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.46019: variable 'network_connections' from source: include params 30582 1726855304.46022: variable 'interface' from source: play vars 30582 1726855304.46130: variable 'interface' from source: play vars 30582 1726855304.46167: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855304.46249: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.46601: variable 'network_connections' from source: include params 30582 1726855304.46604: variable 'interface' from source: play vars 30582 1726855304.46672: variable 'interface' from source: play vars 30582 1726855304.46702: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855304.46783: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855304.47117: variable 'network_connections' 
from source: include params 30582 1726855304.47120: variable 'interface' from source: play vars 30582 1726855304.47318: variable 'interface' from source: play vars 30582 1726855304.47322: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855304.47324: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855304.47326: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855304.47367: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855304.47596: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855304.48546: variable 'network_connections' from source: include params 30582 1726855304.48557: variable 'interface' from source: play vars 30582 1726855304.48633: variable 'interface' from source: play vars 30582 1726855304.48646: variable 'ansible_distribution' from source: facts 30582 1726855304.48654: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.48667: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.48688: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855304.48919: variable 'ansible_distribution' from source: facts 30582 1726855304.48923: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.48925: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.48927: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855304.49261: variable 'ansible_distribution' from source: facts 30582 1726855304.49265: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.49267: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.49371: variable 'network_provider' from source: set_fact 30582 
1726855304.49393: variable 'ansible_facts' from source: unknown 30582 1726855304.51307: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855304.51311: when evaluation is False, skipping this task 30582 1726855304.51321: _execute() done 30582 1726855304.51325: dumping result to json 30582 1726855304.51327: done dumping result, returning 30582 1726855304.51337: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000000d1e] 30582 1726855304.51342: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1e skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855304.51518: no more pending results, returning what we have 30582 1726855304.51522: results queue empty 30582 1726855304.51523: checking for any_errors_fatal 30582 1726855304.51535: done checking for any_errors_fatal 30582 1726855304.51537: checking for max_fail_percentage 30582 1726855304.51541: done checking for max_fail_percentage 30582 1726855304.51542: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.51542: done checking to see if all hosts have failed 30582 1726855304.51543: getting the remaining hosts for this loop 30582 1726855304.51545: done getting the remaining hosts for this loop 30582 1726855304.51549: getting the next task for host managed_node3 30582 1726855304.51559: done getting next task for host managed_node3 30582 1726855304.51563: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855304.51569: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.51592: getting variables 30582 1726855304.51595: in VariableManager get_vars() 30582 1726855304.51837: Calling all_inventory to load vars for managed_node3 30582 1726855304.51840: Calling groups_inventory to load vars for managed_node3 30582 1726855304.51847: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.51883: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.51891: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.51988: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.52546: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1e 30582 1726855304.52550: WORKER PROCESS EXITING 30582 1726855304.54335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.55385: done with get_vars() 30582 1726855304.55418: done getting variables 30582 1726855304.55520: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:01:44 -0400 (0:00:00.176) 0:00:40.905 ****** 30582 1726855304.55567: entering _queue_task() for managed_node3/package 30582 1726855304.56118: worker is 1 (out of 1 available) 30582 1726855304.56130: exiting _queue_task() for managed_node3/package 30582 1726855304.56141: done queuing things up, now waiting for results queue to drain 30582 1726855304.56143: waiting for pending results... 30582 1726855304.56731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855304.57219: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d1f 30582 1726855304.57593: variable 'ansible_search_path' from source: unknown 30582 1726855304.57597: variable 'ansible_search_path' from source: unknown 30582 1726855304.57600: calling self._execute() 30582 1726855304.57882: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.57972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.57992: variable 'omit' from source: magic vars 30582 1726855304.58461: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.58523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.58656: variable 'network_state' from source: role '' defaults 30582 1726855304.58669: Evaluated conditional (network_state != {}): False 30582 1726855304.58673: when evaluation 
is False, skipping this task 30582 1726855304.58678: _execute() done 30582 1726855304.58681: dumping result to json 30582 1726855304.58683: done dumping result, returning 30582 1726855304.58701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000d1f] 30582 1726855304.58704: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1f 30582 1726855304.58995: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d1f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855304.59043: no more pending results, returning what we have 30582 1726855304.59047: results queue empty 30582 1726855304.59048: checking for any_errors_fatal 30582 1726855304.59055: done checking for any_errors_fatal 30582 1726855304.59056: checking for max_fail_percentage 30582 1726855304.59058: done checking for max_fail_percentage 30582 1726855304.59059: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.59060: done checking to see if all hosts have failed 30582 1726855304.59061: getting the remaining hosts for this loop 30582 1726855304.59063: done getting the remaining hosts for this loop 30582 1726855304.59066: getting the next task for host managed_node3 30582 1726855304.59074: done getting next task for host managed_node3 30582 1726855304.59078: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855304.59084: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.59107: getting variables 30582 1726855304.59109: in VariableManager get_vars() 30582 1726855304.59145: Calling all_inventory to load vars for managed_node3 30582 1726855304.59149: Calling groups_inventory to load vars for managed_node3 30582 1726855304.59151: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.59164: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.59167: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.59171: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.59699: WORKER PROCESS EXITING 30582 1726855304.60860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.62882: done with get_vars() 30582 1726855304.62925: done getting variables 30582 1726855304.63020: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:01:44 -0400 (0:00:00.074) 0:00:40.980 ****** 30582 1726855304.63060: entering _queue_task() for managed_node3/package 30582 1726855304.63714: worker is 1 (out of 1 available) 30582 1726855304.63725: exiting _queue_task() for managed_node3/package 30582 1726855304.63736: done queuing things up, now waiting for results queue to drain 30582 1726855304.63738: waiting for pending results... 30582 1726855304.64151: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855304.64493: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d20 30582 1726855304.64559: variable 'ansible_search_path' from source: unknown 30582 1726855304.64563: variable 'ansible_search_path' from source: unknown 30582 1726855304.64567: calling self._execute() 30582 1726855304.64698: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.64708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.64721: variable 'omit' from source: magic vars 30582 1726855304.65140: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.65210: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.65297: variable 'network_state' from source: role '' defaults 30582 1726855304.65317: Evaluated conditional (network_state != {}): False 30582 1726855304.65325: when evaluation is False, skipping this task 30582 1726855304.65332: _execute() done 30582 1726855304.65339: dumping 
result to json 30582 1726855304.65345: done dumping result, returning 30582 1726855304.65358: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000000d20] 30582 1726855304.65367: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d20 30582 1726855304.65606: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d20 30582 1726855304.65616: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855304.65678: no more pending results, returning what we have 30582 1726855304.65682: results queue empty 30582 1726855304.65683: checking for any_errors_fatal 30582 1726855304.65715: done checking for any_errors_fatal 30582 1726855304.65717: checking for max_fail_percentage 30582 1726855304.65720: done checking for max_fail_percentage 30582 1726855304.65721: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.65721: done checking to see if all hosts have failed 30582 1726855304.65722: getting the remaining hosts for this loop 30582 1726855304.65723: done getting the remaining hosts for this loop 30582 1726855304.65727: getting the next task for host managed_node3 30582 1726855304.65737: done getting next task for host managed_node3 30582 1726855304.65741: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855304.65746: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.65770: getting variables 30582 1726855304.65774: in VariableManager get_vars() 30582 1726855304.66025: Calling all_inventory to load vars for managed_node3 30582 1726855304.66028: Calling groups_inventory to load vars for managed_node3 30582 1726855304.66033: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.66054: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.66057: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.66061: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.68047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.69912: done with get_vars() 30582 1726855304.69956: done getting variables 30582 1726855304.70092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:01:44 -0400 (0:00:00.070) 0:00:41.051 ****** 30582 1726855304.70145: entering _queue_task() for managed_node3/service 30582 1726855304.70649: worker is 1 (out of 1 available) 30582 1726855304.70666: exiting _queue_task() for managed_node3/service 30582 1726855304.70681: done queuing things up, now waiting for results queue to drain 30582 1726855304.70683: waiting for pending results... 30582 1726855304.71134: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855304.71139: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d21 30582 1726855304.71228: variable 'ansible_search_path' from source: unknown 30582 1726855304.71232: variable 'ansible_search_path' from source: unknown 30582 1726855304.71235: calling self._execute() 30582 1726855304.71296: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.71300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.71310: variable 'omit' from source: magic vars 30582 1726855304.71753: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.71776: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.72004: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.72143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.74976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.74981: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.75060: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.75064: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.75103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.75266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.75274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.75280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.75390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.75394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.75396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.75432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.75469: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.75523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.75691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.75695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.75697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.75700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.75703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.75705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.75938: variable 'network_connections' from source: include params 30582 1726855304.75984: variable 'interface' from source: play vars 30582 1726855304.76239: variable 'interface' from source: play vars 30582 1726855304.76242: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855304.76357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855304.76417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855304.76449: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855304.76473: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855304.76668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855304.76671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855304.76674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.76676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855304.76678: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855304.76999: variable 'network_connections' from source: include params 30582 1726855304.77007: variable 'interface' from source: play vars 30582 1726855304.77131: variable 'interface' from source: play vars 30582 1726855304.77134: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855304.77137: when evaluation is False, skipping this task 30582 
1726855304.77139: _execute() done 30582 1726855304.77141: dumping result to json 30582 1726855304.77143: done dumping result, returning 30582 1726855304.77145: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000000d21] 30582 1726855304.77147: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d21 30582 1726855304.77386: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d21 30582 1726855304.77458: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855304.77490: no more pending results, returning what we have 30582 1726855304.77494: results queue empty 30582 1726855304.77500: checking for any_errors_fatal 30582 1726855304.77511: done checking for any_errors_fatal 30582 1726855304.77512: checking for max_fail_percentage 30582 1726855304.77514: done checking for max_fail_percentage 30582 1726855304.77518: checking to see if all hosts have failed and the running result is not ok 30582 1726855304.77519: done checking to see if all hosts have failed 30582 1726855304.77520: getting the remaining hosts for this loop 30582 1726855304.77522: done getting the remaining hosts for this loop 30582 1726855304.77526: getting the next task for host managed_node3 30582 1726855304.77536: done getting next task for host managed_node3 30582 1726855304.77543: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855304.77548: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855304.77577: getting variables 30582 1726855304.77580: in VariableManager get_vars() 30582 1726855304.77633: Calling all_inventory to load vars for managed_node3 30582 1726855304.77637: Calling groups_inventory to load vars for managed_node3 30582 1726855304.77639: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855304.77655: Calling all_plugins_play to load vars for managed_node3 30582 1726855304.77659: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855304.77663: Calling groups_plugins_play to load vars for managed_node3 30582 1726855304.80417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855304.82360: done with get_vars() 30582 1726855304.82395: done getting variables 30582 1726855304.82459: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:01:44 -0400 (0:00:00.123) 0:00:41.174 ****** 30582 1726855304.82504: entering _queue_task() for managed_node3/service 30582 1726855304.83094: worker is 1 (out of 1 available) 30582 1726855304.83104: exiting _queue_task() for managed_node3/service 30582 1726855304.83115: done queuing things up, now waiting for results queue to drain 30582 1726855304.83116: waiting for pending results... 30582 1726855304.83223: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855304.83495: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d22 30582 1726855304.83499: variable 'ansible_search_path' from source: unknown 30582 1726855304.83502: variable 'ansible_search_path' from source: unknown 30582 1726855304.83505: calling self._execute() 30582 1726855304.83536: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855304.83546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855304.83569: variable 'omit' from source: magic vars 30582 1726855304.83969: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.83992: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855304.84174: variable 'network_provider' from source: set_fact 30582 1726855304.84186: variable 'network_state' from source: role '' defaults 30582 1726855304.84215: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855304.84231: variable 'omit' from source: magic vars 30582 1726855304.84301: variable 
'omit' from source: magic vars 30582 1726855304.84343: variable 'network_service_name' from source: role '' defaults 30582 1726855304.84409: variable 'network_service_name' from source: role '' defaults 30582 1726855304.84524: variable '__network_provider_setup' from source: role '' defaults 30582 1726855304.84650: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855304.84653: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855304.84656: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855304.84695: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855304.84933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855304.87903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855304.87982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855304.88041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855304.88093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855304.88127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855304.88309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.88313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.88316: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.88335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.88357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.88419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.88447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.88476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.88532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.88553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.88808: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855304.88944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.88985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.89018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.89060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.89177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.89192: variable 'ansible_python' from source: facts 30582 1726855304.89215: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855304.89400: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855304.89403: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855304.89830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.89852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.89884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.90167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.90171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.90233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855304.90304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855304.90418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.90464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855304.90511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855304.91020: variable 'network_connections' from source: include params 30582 1726855304.91023: variable 'interface' from source: play vars 30582 1726855304.91128: variable 'interface' from source: play vars 30582 1726855304.91315: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855304.91701: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855304.91828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855304.91940: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855304.92113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855304.92175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855304.92258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855304.92346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855304.92600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855304.92603: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855304.93235: variable 'network_connections' from source: include params 30582 1726855304.93306: variable 'interface' from source: play vars 30582 1726855304.93502: variable 'interface' from source: play vars 30582 1726855304.93538: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855304.93650: variable '__network_wireless_connections_defined' from source: role '' defaults 
30582 1726855304.94395: variable 'network_connections' from source: include params 30582 1726855304.94406: variable 'interface' from source: play vars 30582 1726855304.94550: variable 'interface' from source: play vars 30582 1726855304.94608: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855304.94866: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855304.95677: variable 'network_connections' from source: include params 30582 1726855304.95795: variable 'interface' from source: play vars 30582 1726855304.95935: variable 'interface' from source: play vars 30582 1726855304.96239: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855304.96376: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855304.96596: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855304.96723: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855304.97392: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855304.98540: variable 'network_connections' from source: include params 30582 1726855304.98552: variable 'interface' from source: play vars 30582 1726855304.98622: variable 'interface' from source: play vars 30582 1726855304.98635: variable 'ansible_distribution' from source: facts 30582 1726855304.98644: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.98655: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.98675: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855304.98863: variable 'ansible_distribution' from source: facts 30582 1726855304.98875: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.98893: variable 'ansible_distribution_major_version' from 
source: facts 30582 1726855304.98930: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855304.99334: variable 'ansible_distribution' from source: facts 30582 1726855304.99634: variable '__network_rh_distros' from source: role '' defaults 30582 1726855304.99637: variable 'ansible_distribution_major_version' from source: facts 30582 1726855304.99639: variable 'network_provider' from source: set_fact 30582 1726855304.99641: variable 'omit' from source: magic vars 30582 1726855304.99643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855304.99721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855304.99744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855304.99832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855304.99933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855304.99972: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855305.00058: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.00244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.00908: Set connection var ansible_timeout to 10 30582 1726855305.00910: Set connection var ansible_connection to ssh 30582 1726855305.00912: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855305.00914: Set connection var ansible_pipelining to False 30582 1726855305.00916: Set connection var ansible_shell_executable to /bin/sh 30582 1726855305.00918: Set connection var ansible_shell_type to sh 30582 1726855305.00919: variable 'ansible_shell_executable' from 
source: unknown 30582 1726855305.00921: variable 'ansible_connection' from source: unknown 30582 1726855305.00923: variable 'ansible_module_compression' from source: unknown 30582 1726855305.00924: variable 'ansible_shell_type' from source: unknown 30582 1726855305.00926: variable 'ansible_shell_executable' from source: unknown 30582 1726855305.00927: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.00929: variable 'ansible_pipelining' from source: unknown 30582 1726855305.00931: variable 'ansible_timeout' from source: unknown 30582 1726855305.00933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.01072: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855305.01169: variable 'omit' from source: magic vars 30582 1726855305.01180: starting attempt loop 30582 1726855305.01297: running the handler 30582 1726855305.01596: variable 'ansible_facts' from source: unknown 30582 1726855305.03113: _low_level_execute_command(): starting 30582 1726855305.03134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855305.04127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.04541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.04581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855305.04673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855305.06570: stdout chunk (state=3): >>>/root <<< 30582 1726855305.06574: stdout chunk (state=3): >>><<< 30582 1726855305.06576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855305.06578: stderr chunk (state=3): >>><<< 30582 1726855305.06683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855305.06689: _low_level_execute_command(): starting 30582 1726855305.06692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366 `" && echo ansible-tmp-1726855305.066481-32483-224629150113366="` echo /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366 `" ) && sleep 0' 30582 1726855305.07458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855305.07514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.07597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855305.07624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.07638: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855305.07728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855305.09659: stdout chunk (state=3): >>>ansible-tmp-1726855305.066481-32483-224629150113366=/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366 <<< 30582 1726855305.09752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855305.09797: stderr chunk (state=3): >>><<< 30582 1726855305.09800: stdout chunk (state=3): >>><<< 30582 1726855305.09803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855305.066481-32483-224629150113366=/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855305.10002: variable 'ansible_module_compression' from source: unknown 
30582 1726855305.10005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855305.10007: variable 'ansible_facts' from source: unknown 30582 1726855305.10153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py 30582 1726855305.10363: Sending initial data 30582 1726855305.10378: Sent initial data (155 bytes) 30582 1726855305.10839: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855305.10880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.10932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.11001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855305.11015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.11043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855305.11303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 
1726855305.12944: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855305.12949: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855305.13027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855305.13095: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp35sv8jc1 /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py <<< 30582 1726855305.13100: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py" <<< 30582 1726855305.13173: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp35sv8jc1" to remote "/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py" <<< 30582 1726855305.14570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855305.14646: stderr chunk (state=3): >>><<< 30582 1726855305.14648: stdout chunk (state=3): >>><<< 30582 
1726855305.14672: done transferring module to remote 30582 1726855305.14697: _low_level_execute_command(): starting 30582 1726855305.14700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/ /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py && sleep 0' 30582 1726855305.15394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.15398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855305.15460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.15464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855305.15467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.15624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855305.15750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855305.17569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855305.17603: stderr chunk (state=3): >>><<< 30582 1726855305.17606: stdout chunk (state=3): >>><<< 30582 1726855305.17619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855305.17621: _low_level_execute_command(): starting 30582 1726855305.17627: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/AnsiballZ_systemd.py && sleep 0' 30582 1726855305.18044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855305.18047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.18050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855305.18052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855305.18093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855305.18099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.18114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855305.18176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855305.47722: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10657792", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321896960", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2095095000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855305.47734: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", 
"RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", 
"UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855305.49848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855305.49965: stderr chunk (state=3): >>><<< 30582 1726855305.49969: stdout chunk (state=3): >>><<< 30582 1726855305.50216: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10657792", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321896960", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2095095000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855305.50291: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855305.50370: _low_level_execute_command(): starting 30582 1726855305.50381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855305.066481-32483-224629150113366/ > /dev/null 2>&1 && sleep 0' 30582 1726855305.51444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855305.51465: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855305.51552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855305.51580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855305.51610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855305.51830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855305.53784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855305.53812: stdout chunk (state=3): >>><<< 30582 1726855305.53833: stderr chunk (state=3): >>><<< 30582 1726855305.53857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 
1726855305.53993: handler run complete 30582 1726855305.53996: attempt loop complete, returning result 30582 1726855305.53999: _execute() done 30582 1726855305.54001: dumping result to json 30582 1726855305.54003: done dumping result, returning 30582 1726855305.54005: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000000d22] 30582 1726855305.54007: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d22 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855305.54985: no more pending results, returning what we have 30582 1726855305.54990: results queue empty 30582 1726855305.54992: checking for any_errors_fatal 30582 1726855305.54997: done checking for any_errors_fatal 30582 1726855305.54997: checking for max_fail_percentage 30582 1726855305.54999: done checking for max_fail_percentage 30582 1726855305.55000: checking to see if all hosts have failed and the running result is not ok 30582 1726855305.55001: done checking to see if all hosts have failed 30582 1726855305.55001: getting the remaining hosts for this loop 30582 1726855305.55003: done getting the remaining hosts for this loop 30582 1726855305.55006: getting the next task for host managed_node3 30582 1726855305.55098: done getting next task for host managed_node3 30582 1726855305.55106: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855305.55112: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855305.55135: getting variables 30582 1726855305.55137: in VariableManager get_vars() 30582 1726855305.55172: Calling all_inventory to load vars for managed_node3 30582 1726855305.55176: Calling groups_inventory to load vars for managed_node3 30582 1726855305.55178: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855305.55341: Calling all_plugins_play to load vars for managed_node3 30582 1726855305.55346: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855305.55350: Calling groups_plugins_play to load vars for managed_node3 30582 1726855305.55869: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d22 30582 1726855305.55876: WORKER PROCESS EXITING 30582 1726855305.57415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855305.59080: done with get_vars() 30582 1726855305.59107: done getting variables 30582 1726855305.59182: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:01:45 -0400 (0:00:00.767) 0:00:41.942 ****** 30582 1726855305.59226: entering _queue_task() for managed_node3/service 30582 1726855305.59595: worker is 1 (out of 1 available) 30582 1726855305.59610: exiting _queue_task() for managed_node3/service 30582 1726855305.59623: done queuing things up, now waiting for results queue to drain 30582 1726855305.59624: waiting for pending results... 30582 1726855305.59860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855305.60011: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d23 30582 1726855305.60031: variable 'ansible_search_path' from source: unknown 30582 1726855305.60039: variable 'ansible_search_path' from source: unknown 30582 1726855305.60083: calling self._execute() 30582 1726855305.60264: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.60268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.60279: variable 'omit' from source: magic vars 30582 1726855305.60893: variable 'ansible_distribution_major_version' from source: facts 30582 1726855305.60896: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855305.60903: variable 'network_provider' from source: set_fact 30582 1726855305.60906: Evaluated conditional (network_provider == "nm"): True 30582 1726855305.60952: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855305.61050: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30582 1726855305.61246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855305.63393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855305.63458: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855305.63498: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855305.63536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855305.63563: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855305.63661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855305.63692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855305.63715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855305.63757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855305.63769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855305.63819: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855305.63843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855305.63995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855305.63998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855305.64001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855305.64003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855305.64005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855305.64070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855305.64073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855305.64079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855305.64279: variable 'network_connections' from source: include params 30582 1726855305.64282: variable 'interface' from source: play vars 30582 1726855305.64285: variable 'interface' from source: play vars 30582 1726855305.64356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855305.64524: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855305.64560: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855305.64591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855305.64623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855305.64665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855305.64686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855305.64715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855305.64741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855305.64790: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30582 1726855305.65078: variable 'network_connections' from source: include params 30582 1726855305.65081: variable 'interface' from source: play vars 30582 1726855305.65194: variable 'interface' from source: play vars 30582 1726855305.65198: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855305.65200: when evaluation is False, skipping this task 30582 1726855305.65202: _execute() done 30582 1726855305.65205: dumping result to json 30582 1726855305.65207: done dumping result, returning 30582 1726855305.65209: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000000d23] 30582 1726855305.65220: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d23 30582 1726855305.65593: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d23 30582 1726855305.65596: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855305.65636: no more pending results, returning what we have 30582 1726855305.65639: results queue empty 30582 1726855305.65640: checking for any_errors_fatal 30582 1726855305.65659: done checking for any_errors_fatal 30582 1726855305.65660: checking for max_fail_percentage 30582 1726855305.65662: done checking for max_fail_percentage 30582 1726855305.65663: checking to see if all hosts have failed and the running result is not ok 30582 1726855305.65664: done checking to see if all hosts have failed 30582 1726855305.65665: getting the remaining hosts for this loop 30582 1726855305.65666: done getting the remaining hosts for this loop 30582 1726855305.65669: getting the next task for host managed_node3 30582 1726855305.65680: done getting next task for host managed_node3 30582 1726855305.65685: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30582 1726855305.65691: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855305.65739: getting variables 30582 1726855305.65741: in VariableManager get_vars() 30582 1726855305.65779: Calling all_inventory to load vars for managed_node3 30582 1726855305.65782: Calling groups_inventory to load vars for managed_node3 30582 1726855305.65784: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855305.65795: Calling all_plugins_play to load vars for managed_node3 30582 1726855305.65798: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855305.65801: Calling groups_plugins_play to load vars for managed_node3 30582 1726855305.67378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855305.69204: done with get_vars() 30582 1726855305.69227: done getting variables 30582 1726855305.69289: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:01:45 -0400 (0:00:00.101) 0:00:42.043 ****** 30582 1726855305.69329: entering _queue_task() for managed_node3/service 30582 1726855305.69791: worker is 1 (out of 1 available) 30582 1726855305.69805: exiting _queue_task() for managed_node3/service 30582 1726855305.69816: done queuing things up, now waiting for results queue to drain 30582 1726855305.69817: waiting for pending results... 
30582 1726855305.70031: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855305.70181: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d24 30582 1726855305.70218: variable 'ansible_search_path' from source: unknown 30582 1726855305.70241: variable 'ansible_search_path' from source: unknown 30582 1726855305.70494: calling self._execute() 30582 1726855305.70497: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.70500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.70502: variable 'omit' from source: magic vars 30582 1726855305.70908: variable 'ansible_distribution_major_version' from source: facts 30582 1726855305.70979: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855305.71229: variable 'network_provider' from source: set_fact 30582 1726855305.71242: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855305.71250: when evaluation is False, skipping this task 30582 1726855305.71258: _execute() done 30582 1726855305.71266: dumping result to json 30582 1726855305.71281: done dumping result, returning 30582 1726855305.71296: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000000d24] 30582 1726855305.71307: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d24 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855305.71561: no more pending results, returning what we have 30582 1726855305.71565: results queue empty 30582 1726855305.71566: checking for any_errors_fatal 30582 1726855305.71576: done checking for any_errors_fatal 30582 1726855305.71577: checking for max_fail_percentage 30582 1726855305.71579: done checking for max_fail_percentage 30582 
1726855305.71580: checking to see if all hosts have failed and the running result is not ok 30582 1726855305.71581: done checking to see if all hosts have failed 30582 1726855305.71582: getting the remaining hosts for this loop 30582 1726855305.71583: done getting the remaining hosts for this loop 30582 1726855305.71589: getting the next task for host managed_node3 30582 1726855305.71598: done getting next task for host managed_node3 30582 1726855305.71602: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855305.71608: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855305.71631: getting variables 30582 1726855305.71633: in VariableManager get_vars() 30582 1726855305.71669: Calling all_inventory to load vars for managed_node3 30582 1726855305.71672: Calling groups_inventory to load vars for managed_node3 30582 1726855305.71674: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855305.71686: Calling all_plugins_play to load vars for managed_node3 30582 1726855305.71768: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855305.71808: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d24 30582 1726855305.71811: WORKER PROCESS EXITING 30582 1726855305.71823: Calling groups_plugins_play to load vars for managed_node3 30582 1726855305.74383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855305.77729: done with get_vars() 30582 1726855305.77828: done getting variables 30582 1726855305.77898: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:01:45 -0400 (0:00:00.086) 0:00:42.129 ****** 30582 1726855305.77941: entering _queue_task() for managed_node3/copy 30582 1726855305.78607: worker is 1 (out of 1 available) 30582 1726855305.78620: exiting _queue_task() for managed_node3/copy 30582 1726855305.78631: done queuing things up, now waiting for results queue to drain 30582 1726855305.78634: waiting for pending results... 
30582 1726855305.78889: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855305.79050: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d25 30582 1726855305.79070: variable 'ansible_search_path' from source: unknown 30582 1726855305.79081: variable 'ansible_search_path' from source: unknown 30582 1726855305.79127: calling self._execute() 30582 1726855305.79226: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.79244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.79259: variable 'omit' from source: magic vars 30582 1726855305.79728: variable 'ansible_distribution_major_version' from source: facts 30582 1726855305.79744: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855305.79894: variable 'network_provider' from source: set_fact 30582 1726855305.79899: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855305.79901: when evaluation is False, skipping this task 30582 1726855305.79903: _execute() done 30582 1726855305.79906: dumping result to json 30582 1726855305.80092: done dumping result, returning 30582 1726855305.80097: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000000d25] 30582 1726855305.80100: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d25 30582 1726855305.80170: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d25 30582 1726855305.80177: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855305.80227: no more pending results, returning what we have 30582 1726855305.80231: results queue empty 30582 1726855305.80232: checking for 
any_errors_fatal 30582 1726855305.80238: done checking for any_errors_fatal 30582 1726855305.80239: checking for max_fail_percentage 30582 1726855305.80241: done checking for max_fail_percentage 30582 1726855305.80242: checking to see if all hosts have failed and the running result is not ok 30582 1726855305.80243: done checking to see if all hosts have failed 30582 1726855305.80244: getting the remaining hosts for this loop 30582 1726855305.80246: done getting the remaining hosts for this loop 30582 1726855305.80250: getting the next task for host managed_node3 30582 1726855305.80259: done getting next task for host managed_node3 30582 1726855305.80264: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855305.80269: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855305.80304: getting variables 30582 1726855305.80307: in VariableManager get_vars() 30582 1726855305.80346: Calling all_inventory to load vars for managed_node3 30582 1726855305.80349: Calling groups_inventory to load vars for managed_node3 30582 1726855305.80352: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855305.80364: Calling all_plugins_play to load vars for managed_node3 30582 1726855305.80368: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855305.80371: Calling groups_plugins_play to load vars for managed_node3 30582 1726855305.82953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855305.84609: done with get_vars() 30582 1726855305.84642: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:01:45 -0400 (0:00:00.070) 0:00:42.199 ****** 30582 1726855305.84945: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855305.85647: worker is 1 (out of 1 available) 30582 1726855305.85663: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855305.85679: done queuing things up, now waiting for results queue to drain 30582 1726855305.85681: waiting for pending results... 
30582 1726855305.86416: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855305.86621: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d26 30582 1726855305.86625: variable 'ansible_search_path' from source: unknown 30582 1726855305.86627: variable 'ansible_search_path' from source: unknown 30582 1726855305.86643: calling self._execute() 30582 1726855305.86757: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855305.86770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855305.86795: variable 'omit' from source: magic vars 30582 1726855305.87202: variable 'ansible_distribution_major_version' from source: facts 30582 1726855305.87225: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855305.87276: variable 'omit' from source: magic vars 30582 1726855305.87323: variable 'omit' from source: magic vars 30582 1726855305.87512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855305.91118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855305.91204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855305.91291: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855305.91295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855305.91328: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855305.91421: variable 'network_provider' from source: set_fact 30582 1726855305.91558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855305.91593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855305.91714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855305.91721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855305.91723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855305.91780: variable 'omit' from source: magic vars 30582 1726855305.91892: variable 'omit' from source: magic vars 30582 1726855305.92007: variable 'network_connections' from source: include params 30582 1726855305.92023: variable 'interface' from source: play vars 30582 1726855305.92099: variable 'interface' from source: play vars 30582 1726855305.92317: variable 'omit' from source: magic vars 30582 1726855305.92340: variable '__lsr_ansible_managed' from source: task vars 30582 1726855305.92429: variable '__lsr_ansible_managed' from source: task vars 30582 1726855305.93001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855305.93196: Loaded config def from plugin (lookup/template) 30582 1726855305.93300: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855305.93343: File lookup term: get_ansible_managed.j2 30582 1726855305.93440: variable 
'ansible_search_path' from source: unknown 30582 1726855305.93452: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855305.93480: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855305.93509: variable 'ansible_search_path' from source: unknown 30582 1726855306.04320: variable 'ansible_managed' from source: unknown 30582 1726855306.04741: variable 'omit' from source: magic vars 30582 1726855306.04875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855306.05108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855306.05141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855306.05277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855306.05333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855306.05594: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855306.05792: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.05796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.05798: Set connection var ansible_timeout to 10 30582 1726855306.05800: Set connection var ansible_connection to ssh 30582 1726855306.05802: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855306.05804: Set connection var ansible_pipelining to False 30582 1726855306.05806: Set connection var ansible_shell_executable to /bin/sh 30582 1726855306.05808: Set connection var ansible_shell_type to sh 30582 1726855306.05810: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.05812: variable 'ansible_connection' from source: unknown 30582 1726855306.05814: variable 'ansible_module_compression' from source: unknown 30582 1726855306.05816: variable 'ansible_shell_type' from source: unknown 30582 1726855306.05818: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.05820: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.05847: variable 'ansible_pipelining' from source: unknown 30582 1726855306.05856: variable 'ansible_timeout' from source: unknown 30582 1726855306.05866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.06318: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855306.06408: variable 'omit' from 
source: magic vars 30582 1726855306.06421: starting attempt loop 30582 1726855306.06504: running the handler 30582 1726855306.06518: _low_level_execute_command(): starting 30582 1726855306.06521: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855306.07749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.07879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.07915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855306.07947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.08209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.09717: stdout chunk (state=3): >>>/root <<< 30582 1726855306.09891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855306.09895: stdout chunk (state=3): >>><<< 30582 1726855306.09898: stderr chunk (state=3): >>><<< 30582 1726855306.09920: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855306.10023: _low_level_execute_command(): starting 30582 1726855306.10027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097 `" && echo ansible-tmp-1726855306.0992706-32532-227670290471097="` echo /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097 `" ) && sleep 0' 30582 1726855306.10812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855306.10897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855306.11025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.11058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.11088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855306.11130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.11279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.13154: stdout chunk (state=3): >>>ansible-tmp-1726855306.0992706-32532-227670290471097=/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097 <<< 30582 1726855306.13316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855306.13321: stdout chunk (state=3): >>><<< 30582 1726855306.13323: stderr chunk (state=3): >>><<< 30582 1726855306.13495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855306.0992706-32532-227670290471097=/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855306.13499: variable 'ansible_module_compression' from source: unknown 30582 1726855306.13502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855306.13504: variable 'ansible_facts' from source: unknown 30582 1726855306.13617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py 30582 1726855306.13866: Sending initial data 30582 1726855306.13869: Sent initial data (168 bytes) 30582 1726855306.14508: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855306.14531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855306.14621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.14652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.14694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855306.14697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.14768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.16384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855306.16390: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855306.16470: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855306.16553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjnxpy7i6 /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py <<< 30582 1726855306.16556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py" <<< 30582 1726855306.16652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjnxpy7i6" to remote "/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py" <<< 30582 1726855306.18583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855306.18589: stdout chunk (state=3): >>><<< 30582 1726855306.18591: stderr chunk (state=3): >>><<< 30582 1726855306.18670: done transferring module to remote 30582 1726855306.18673: _low_level_execute_command(): starting 30582 1726855306.18676: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/ /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py && sleep 0' 30582 1726855306.19385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855306.19443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.19561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.19582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855306.19615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.19704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.21590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855306.21693: stderr chunk (state=3): >>><<< 30582 1726855306.21696: stdout chunk (state=3): >>><<< 30582 1726855306.21699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855306.21701: _low_level_execute_command(): starting 30582 1726855306.21703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/AnsiballZ_network_connections.py && sleep 0' 30582 1726855306.22513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855306.22588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.22647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.22660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 
1726855306.22766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.22854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.48026: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855306.49782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855306.49823: stdout chunk (state=3): >>><<< 30582 1726855306.49826: stderr chunk (state=3): >>><<< 30582 1726855306.49843: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855306.49893: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855306.49993: _low_level_execute_command(): starting 30582 1726855306.49996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855306.0992706-32532-227670290471097/ > /dev/null 2>&1 && sleep 0' 30582 1726855306.50649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855306.50680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855306.50703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855306.50795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855306.50839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855306.50856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855306.50902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855306.51004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855306.52999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855306.53066: stderr chunk (state=3): >>><<< 30582 1726855306.53081: stdout chunk (state=3): >>><<< 30582 1726855306.53193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855306.53197: handler run complete 30582 1726855306.53199: attempt loop complete, returning result 30582 1726855306.53201: _execute() done 30582 1726855306.53204: dumping result to json 30582 1726855306.53206: done dumping result, returning 30582 1726855306.53210: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000000d26] 30582 1726855306.53212: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d26 ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active 30582 1726855306.53532: no more pending results, returning what we have 30582 1726855306.53537: results queue empty 30582 1726855306.53538: checking for any_errors_fatal 30582 1726855306.53547: done checking for any_errors_fatal 30582 1726855306.53548: checking for max_fail_percentage 30582 1726855306.53901: done checking for max_fail_percentage 30582 1726855306.53903: checking to see if all hosts have failed and the running result is not ok 30582 1726855306.53904: done checking to see if all hosts have failed 30582 1726855306.53911: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d26 30582 1726855306.53919: WORKER PROCESS EXITING 30582 1726855306.53915: getting the remaining hosts 
for this loop 30582 1726855306.53922: done getting the remaining hosts for this loop 30582 1726855306.53926: getting the next task for host managed_node3 30582 1726855306.53936: done getting next task for host managed_node3 30582 1726855306.53940: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855306.53954: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855306.53967: getting variables 30582 1726855306.53969: in VariableManager get_vars() 30582 1726855306.54158: Calling all_inventory to load vars for managed_node3 30582 1726855306.54161: Calling groups_inventory to load vars for managed_node3 30582 1726855306.54163: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855306.54177: Calling all_plugins_play to load vars for managed_node3 30582 1726855306.54181: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855306.54185: Calling groups_plugins_play to load vars for managed_node3 30582 1726855306.57285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855306.59473: done with get_vars() 30582 1726855306.59512: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:01:46 -0400 (0:00:00.746) 0:00:42.945 ****** 30582 1726855306.59616: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855306.60093: worker is 1 (out of 1 available) 30582 1726855306.60108: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855306.60121: done queuing things up, now waiting for results queue to drain 30582 1726855306.60122: waiting for pending results... 
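The task above shows the full remote-execution cycle Ansible performs for a module: create a private temp directory, upload the AnsiballZ payload over sftp, mark it executable, run it with the remote Python, then remove the temp directory. A minimal sketch of that remote-side sequence (the tmpdir name is illustrative here; the real run uses a timestamped path like `ansible-tmp-1726855306.0992706-32532-227670290471097`, and the payload arrives via sftp rather than `touch`):

```shell
# Sketch of the remote command sequence visible in the log above.
# Path name is illustrative, not the actual timestamped tmpdir.
TMP="$HOME/.ansible/tmp/ansible-tmp-sketch"

# 1) create a private temp dir (umask 77 => only the owner can read it)
( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir -p "$TMP" )

# 2) stand-in for the sftp upload of AnsiballZ_network_connections.py
touch "$TMP/AnsiballZ_network_connections.py"

# 3) make the dir and payload executable, as the chmod step in the log does
chmod u+x "$TMP" "$TMP/AnsiballZ_network_connections.py"

# 4) the real run executes the payload with the remote interpreter:
# /usr/bin/python3.12 "$TMP/AnsiballZ_network_connections.py"

# 5) clean up, matching the final rm -f -r step in the log
rm -rf "$TMP"
echo "cleanup complete"
```

Each step corresponds to one `_low_level_execute_command()` call in the log; the sftp transfer is the only step that does not go through `/bin/sh -c`.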
30582 1726855306.60678: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855306.61193: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d27 30582 1726855306.61199: variable 'ansible_search_path' from source: unknown 30582 1726855306.61201: variable 'ansible_search_path' from source: unknown 30582 1726855306.61240: calling self._execute() 30582 1726855306.61461: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.61647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.61650: variable 'omit' from source: magic vars 30582 1726855306.62334: variable 'ansible_distribution_major_version' from source: facts 30582 1726855306.62478: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855306.62801: variable 'network_state' from source: role '' defaults 30582 1726855306.62854: Evaluated conditional (network_state != {}): False 30582 1726855306.62862: when evaluation is False, skipping this task 30582 1726855306.62907: _execute() done 30582 1726855306.62916: dumping result to json 30582 1726855306.62924: done dumping result, returning 30582 1726855306.62937: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000000d27] 30582 1726855306.62947: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d27 30582 1726855306.63360: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d27 30582 1726855306.63364: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855306.63422: no more pending results, returning what we have 30582 1726855306.63426: results queue empty 30582 1726855306.63427: checking for any_errors_fatal 30582 1726855306.63443: done checking for any_errors_fatal 
30582 1726855306.63444: checking for max_fail_percentage 30582 1726855306.63446: done checking for max_fail_percentage 30582 1726855306.63447: checking to see if all hosts have failed and the running result is not ok 30582 1726855306.63448: done checking to see if all hosts have failed 30582 1726855306.63449: getting the remaining hosts for this loop 30582 1726855306.63450: done getting the remaining hosts for this loop 30582 1726855306.63454: getting the next task for host managed_node3 30582 1726855306.63464: done getting next task for host managed_node3 30582 1726855306.63469: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855306.63476: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855306.63502: getting variables 30582 1726855306.63504: in VariableManager get_vars() 30582 1726855306.63542: Calling all_inventory to load vars for managed_node3 30582 1726855306.63545: Calling groups_inventory to load vars for managed_node3 30582 1726855306.63547: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855306.63560: Calling all_plugins_play to load vars for managed_node3 30582 1726855306.63563: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855306.63566: Calling groups_plugins_play to load vars for managed_node3 30582 1726855306.65670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855306.67648: done with get_vars() 30582 1726855306.67680: done getting variables 30582 1726855306.67751: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:01:46 -0400 (0:00:00.081) 0:00:43.027 ****** 30582 1726855306.67792: entering _queue_task() for managed_node3/debug 30582 1726855306.68289: worker is 1 (out of 1 available) 30582 1726855306.68301: exiting _queue_task() for managed_node3/debug 30582 1726855306.68312: done queuing things up, now waiting for results queue to drain 30582 1726855306.68314: waiting for pending results... 
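The "Configure networking state" task above was skipped because its conditional failed: `network_state` comes from the role defaults as an empty dict, so `network_state != {}` evaluates False. A small sketch of that skip decision (the dict literals mirror the skip result printed in the log; the variable names are taken from the log, not from the role source):

```python
# Sketch of the conditional-skip decision shown in the log:
# the role default network_state is {}, so `when: network_state != {}`
# is False and the task is skipped with the result below.
network_state = {}  # role '' defaults, as reported in the log

run_task = network_state != {}
if not run_task:
    result = {
        "changed": False,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }
print(result["skip_reason"])
```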
30582 1726855306.68727: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855306.68872: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d28 30582 1726855306.68935: variable 'ansible_search_path' from source: unknown 30582 1726855306.68939: variable 'ansible_search_path' from source: unknown 30582 1726855306.68971: calling self._execute() 30582 1726855306.69089: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.69145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.69148: variable 'omit' from source: magic vars 30582 1726855306.69532: variable 'ansible_distribution_major_version' from source: facts 30582 1726855306.69547: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855306.69555: variable 'omit' from source: magic vars 30582 1726855306.69632: variable 'omit' from source: magic vars 30582 1726855306.69670: variable 'omit' from source: magic vars 30582 1726855306.69797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855306.69801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855306.69804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855306.69831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855306.69849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855306.69886: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855306.69903: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.69911: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855306.70019: Set connection var ansible_timeout to 10 30582 1726855306.70026: Set connection var ansible_connection to ssh 30582 1726855306.70041: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855306.70049: Set connection var ansible_pipelining to False 30582 1726855306.70057: Set connection var ansible_shell_executable to /bin/sh 30582 1726855306.70062: Set connection var ansible_shell_type to sh 30582 1726855306.70090: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.70121: variable 'ansible_connection' from source: unknown 30582 1726855306.70124: variable 'ansible_module_compression' from source: unknown 30582 1726855306.70127: variable 'ansible_shell_type' from source: unknown 30582 1726855306.70128: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.70131: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.70133: variable 'ansible_pipelining' from source: unknown 30582 1726855306.70135: variable 'ansible_timeout' from source: unknown 30582 1726855306.70231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.70307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855306.70324: variable 'omit' from source: magic vars 30582 1726855306.70341: starting attempt loop 30582 1726855306.70347: running the handler 30582 1726855306.70530: variable '__network_connections_result' from source: set_fact 30582 1726855306.70641: handler run complete 30582 1726855306.70703: attempt loop complete, returning result 30582 1726855306.70710: _execute() done 30582 1726855306.70716: dumping result to json 30582 1726855306.70734: 
done dumping result, returning 30582 1726855306.70759: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000d28] 30582 1726855306.70789: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d28 30582 1726855306.71000: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d28 30582 1726855306.71003: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active" ] } 30582 1726855306.71073: no more pending results, returning what we have 30582 1726855306.71080: results queue empty 30582 1726855306.71081: checking for any_errors_fatal 30582 1726855306.71092: done checking for any_errors_fatal 30582 1726855306.71093: checking for max_fail_percentage 30582 1726855306.71193: done checking for max_fail_percentage 30582 1726855306.71195: checking to see if all hosts have failed and the running result is not ok 30582 1726855306.71196: done checking to see if all hosts have failed 30582 1726855306.71197: getting the remaining hosts for this loop 30582 1726855306.71199: done getting the remaining hosts for this loop 30582 1726855306.71205: getting the next task for host managed_node3 30582 1726855306.71215: done getting next task for host managed_node3 30582 1726855306.71220: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855306.71226: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855306.71239: getting variables 30582 1726855306.71241: in VariableManager get_vars() 30582 1726855306.71283: Calling all_inventory to load vars for managed_node3 30582 1726855306.71286: Calling groups_inventory to load vars for managed_node3 30582 1726855306.71406: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855306.71417: Calling all_plugins_play to load vars for managed_node3 30582 1726855306.71420: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855306.71424: Calling groups_plugins_play to load vars for managed_node3 30582 1726855306.73916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855306.77293: done with get_vars() 30582 1726855306.77503: done getting variables 30582 1726855306.77662: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:01:46 -0400 (0:00:00.099) 0:00:43.126 ****** 30582 1726855306.77709: entering _queue_task() for managed_node3/debug 30582 1726855306.78563: worker is 1 (out of 1 available) 30582 1726855306.78579: exiting _queue_task() for managed_node3/debug 30582 1726855306.78593: done queuing things up, now waiting for results queue to drain 30582 1726855306.78595: waiting for pending results... 30582 1726855306.79245: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855306.79505: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d29 30582 1726855306.79509: variable 'ansible_search_path' from source: unknown 30582 1726855306.79536: variable 'ansible_search_path' from source: unknown 30582 1726855306.79662: calling self._execute() 30582 1726855306.79717: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.79742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.79756: variable 'omit' from source: magic vars 30582 1726855306.80434: variable 'ansible_distribution_major_version' from source: facts 30582 1726855306.80453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855306.80750: variable 'omit' from source: magic vars 30582 1726855306.80754: variable 'omit' from source: magic vars 30582 1726855306.80756: variable 'omit' from source: magic vars 30582 1726855306.80759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855306.80794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855306.80805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855306.80827: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855306.80834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855306.80879: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855306.80883: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.80885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.80998: Set connection var ansible_timeout to 10 30582 1726855306.81001: Set connection var ansible_connection to ssh 30582 1726855306.81008: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855306.81013: Set connection var ansible_pipelining to False 30582 1726855306.81018: Set connection var ansible_shell_executable to /bin/sh 30582 1726855306.81021: Set connection var ansible_shell_type to sh 30582 1726855306.81045: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.81048: variable 'ansible_connection' from source: unknown 30582 1726855306.81051: variable 'ansible_module_compression' from source: unknown 30582 1726855306.81053: variable 'ansible_shell_type' from source: unknown 30582 1726855306.81055: variable 'ansible_shell_executable' from source: unknown 30582 1726855306.81057: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.81059: variable 'ansible_pipelining' from source: unknown 30582 1726855306.81064: variable 'ansible_timeout' from source: unknown 30582 1726855306.81068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.81230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855306.81314: variable 'omit' from source: magic vars 30582 1726855306.81317: starting attempt loop 30582 1726855306.81320: running the handler 30582 1726855306.81322: variable '__network_connections_result' from source: set_fact 30582 1726855306.81513: variable '__network_connections_result' from source: set_fact 30582 1726855306.81516: handler run complete 30582 1726855306.81519: attempt loop complete, returning result 30582 1726855306.81530: _execute() done 30582 1726855306.81533: dumping result to json 30582 1726855306.81535: done dumping result, returning 30582 1726855306.81538: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000000d29] 30582 1726855306.81545: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d29 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 7b764d37-80c8-473a-b5aa-e42b924ac508 skipped because already active" ] } } 30582 1726855306.81859: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d29 30582 1726855306.81863: WORKER PROCESS EXITING 30582 1726855306.81894: no more pending results, returning what we have 30582 1726855306.81898: results queue empty 30582 
1726855306.81899: checking for any_errors_fatal 30582 1726855306.81906: done checking for any_errors_fatal 30582 1726855306.81907: checking for max_fail_percentage 30582 1726855306.81909: done checking for max_fail_percentage 30582 1726855306.81909: checking to see if all hosts have failed and the running result is not ok 30582 1726855306.81910: done checking to see if all hosts have failed 30582 1726855306.81911: getting the remaining hosts for this loop 30582 1726855306.81912: done getting the remaining hosts for this loop 30582 1726855306.81916: getting the next task for host managed_node3 30582 1726855306.81926: done getting next task for host managed_node3 30582 1726855306.81929: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855306.81934: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855306.81945: getting variables 30582 1726855306.81947: in VariableManager get_vars() 30582 1726855306.82100: Calling all_inventory to load vars for managed_node3 30582 1726855306.82103: Calling groups_inventory to load vars for managed_node3 30582 1726855306.82111: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855306.82119: Calling all_plugins_play to load vars for managed_node3 30582 1726855306.82122: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855306.82124: Calling groups_plugins_play to load vars for managed_node3 30582 1726855306.84649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855306.90844: done with get_vars() 30582 1726855306.90884: done getting variables 30582 1726855306.91108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:01:46 -0400 (0:00:00.134) 0:00:43.261 ****** 30582 1726855306.91199: entering _queue_task() for managed_node3/debug 30582 1726855306.92141: worker is 1 (out of 1 available) 30582 1726855306.92153: exiting _queue_task() for managed_node3/debug 30582 1726855306.92164: done queuing things up, now waiting for results queue to drain 30582 1726855306.92166: waiting for pending results... 
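The task queued here (main.yml:186) is skipped a few lines later with `false_condition: network_state != {}`, meaning `network_state` was still the role's empty default. A conditional debug task of that shape would look roughly like the sketch below; the conditional is taken from the reported `false_condition`, while the task body is an assumption.

```yaml
# Hedged sketch; the "when" expression matches the false_condition
# reported in the skip result, the debug body is assumed.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
```

Note that when `when` evaluates False, the executor short-circuits before loading any connection or action plugin, which is why this task's trace is much shorter than the preceding ones.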
30582 1726855306.92570: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855306.93122: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d2a 30582 1726855306.93202: variable 'ansible_search_path' from source: unknown 30582 1726855306.93207: variable 'ansible_search_path' from source: unknown 30582 1726855306.93279: calling self._execute() 30582 1726855306.93297: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855306.93300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855306.93495: variable 'omit' from source: magic vars 30582 1726855306.94285: variable 'ansible_distribution_major_version' from source: facts 30582 1726855306.94294: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855306.94634: variable 'network_state' from source: role '' defaults 30582 1726855306.94804: Evaluated conditional (network_state != {}): False 30582 1726855306.94808: when evaluation is False, skipping this task 30582 1726855306.94810: _execute() done 30582 1726855306.94816: dumping result to json 30582 1726855306.94823: done dumping result, returning 30582 1726855306.94826: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000000d2a] 30582 1726855306.94828: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d2a skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855306.95172: no more pending results, returning what we have 30582 1726855306.95176: results queue empty 30582 1726855306.95177: checking for any_errors_fatal 30582 1726855306.95204: done checking for any_errors_fatal 30582 1726855306.95205: checking for max_fail_percentage 30582 1726855306.95208: done checking for max_fail_percentage 30582 1726855306.95209: checking to see if all hosts have 
failed and the running result is not ok 30582 1726855306.95210: done checking to see if all hosts have failed 30582 1726855306.95211: getting the remaining hosts for this loop 30582 1726855306.95212: done getting the remaining hosts for this loop 30582 1726855306.95217: getting the next task for host managed_node3 30582 1726855306.95226: done getting next task for host managed_node3 30582 1726855306.95230: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855306.95254: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855306.95265: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d2a 30582 1726855306.95268: WORKER PROCESS EXITING 30582 1726855306.95283: getting variables 30582 1726855306.95285: in VariableManager get_vars() 30582 1726855306.95352: Calling all_inventory to load vars for managed_node3 30582 1726855306.95356: Calling groups_inventory to load vars for managed_node3 30582 1726855306.95358: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855306.95370: Calling all_plugins_play to load vars for managed_node3 30582 1726855306.95373: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855306.95376: Calling groups_plugins_play to load vars for managed_node3 30582 1726855306.97471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.17318: done with get_vars() 30582 1726855307.17661: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:01:47 -0400 (0:00:00.267) 0:00:43.529 ****** 30582 1726855307.17950: entering _queue_task() for managed_node3/ping 30582 1726855307.18995: worker is 1 (out of 1 available) 30582 1726855307.19006: exiting _queue_task() for managed_node3/ping 30582 1726855307.19017: done queuing things up, now waiting for results queue to drain 30582 1726855307.19019: waiting for pending results... 
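The `Re-test connectivity` task (main.yml:192) resolves to the `ping` action, which round-trips a small module payload over the existing SSH connection rather than sending ICMP. As a task it is essentially just the following (a minimal sketch; the role's actual task may carry additional keywords):

```yaml
# Sketch of a connectivity re-test via the ping module; ping verifies that
# Ansible can log in, create a temp dir, and run Python on the target --
# it does not test ICMP reachability.
- name: Re-test connectivity
  ping:
```

Unlike the `debug` tasks above, `ping` must execute on the target, which is why the log that follows shows `_low_level_execute_command()` calls: an `echo ~` to resolve the remote home directory, a `mkdir` for the remote temp directory, and an SFTP transfer of `AnsiballZ_ping.py`.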
30582 1726855307.19208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855307.19337: in run() - task 0affcc66-ac2b-aa83-7d57-000000000d2b 30582 1726855307.19394: variable 'ansible_search_path' from source: unknown 30582 1726855307.19398: variable 'ansible_search_path' from source: unknown 30582 1726855307.19428: calling self._execute() 30582 1726855307.19539: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.19586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.19592: variable 'omit' from source: magic vars 30582 1726855307.20000: variable 'ansible_distribution_major_version' from source: facts 30582 1726855307.20023: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855307.20095: variable 'omit' from source: magic vars 30582 1726855307.20109: variable 'omit' from source: magic vars 30582 1726855307.20159: variable 'omit' from source: magic vars 30582 1726855307.20204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855307.20251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855307.20276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855307.20323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855307.20327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855307.20369: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855307.20431: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.20435: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855307.20520: Set connection var ansible_timeout to 10 30582 1726855307.20528: Set connection var ansible_connection to ssh 30582 1726855307.20554: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855307.20567: Set connection var ansible_pipelining to False 30582 1726855307.20577: Set connection var ansible_shell_executable to /bin/sh 30582 1726855307.20585: Set connection var ansible_shell_type to sh 30582 1726855307.20614: variable 'ansible_shell_executable' from source: unknown 30582 1726855307.20625: variable 'ansible_connection' from source: unknown 30582 1726855307.20647: variable 'ansible_module_compression' from source: unknown 30582 1726855307.20650: variable 'ansible_shell_type' from source: unknown 30582 1726855307.20652: variable 'ansible_shell_executable' from source: unknown 30582 1726855307.20654: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.20693: variable 'ansible_pipelining' from source: unknown 30582 1726855307.20696: variable 'ansible_timeout' from source: unknown 30582 1726855307.20698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.21013: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855307.21018: variable 'omit' from source: magic vars 30582 1726855307.21021: starting attempt loop 30582 1726855307.21023: running the handler 30582 1726855307.21025: _low_level_execute_command(): starting 30582 1726855307.21027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855307.21730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855307.21748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 
1726855307.21772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855307.21798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855307.21901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855307.21924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.21946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.22127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.23819: stdout chunk (state=3): >>>/root <<< 30582 1726855307.23955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.23977: stdout chunk (state=3): >>><<< 30582 1726855307.23994: stderr chunk (state=3): >>><<< 30582 1726855307.24025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855307.24048: _low_level_execute_command(): starting 30582 1726855307.24061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699 `" && echo ansible-tmp-1726855307.2403367-32599-58611571790699="` echo /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699 `" ) && sleep 0' 30582 1726855307.24685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855307.24702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855307.24715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855307.24731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855307.24746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855307.24763: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855307.24777: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855307.24797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855307.24883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855307.24903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.24918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.25012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.26964: stdout chunk (state=3): >>>ansible-tmp-1726855307.2403367-32599-58611571790699=/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699 <<< 30582 1726855307.27290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.27294: stdout chunk (state=3): >>><<< 30582 1726855307.27312: stderr chunk (state=3): >>><<< 30582 1726855307.27318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855307.2403367-32599-58611571790699=/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855307.27625: variable 'ansible_module_compression' from source: unknown 30582 1726855307.27666: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855307.27792: variable 'ansible_facts' from source: unknown 30582 1726855307.27799: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py 30582 1726855307.28079: Sending initial data 30582 1726855307.28083: Sent initial data (152 bytes) 30582 1726855307.28743: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855307.28803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855307.28856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855307.28875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.28893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.28985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.30820: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855307.30893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855307.31093: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp2d5cvvqu /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py <<< 30582 1726855307.31104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py" <<< 30582 1726855307.31184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp2d5cvvqu" to remote "/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py" <<< 30582 1726855307.32405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.32448: stderr chunk (state=3): >>><<< 30582 1726855307.32460: stdout chunk (state=3): >>><<< 30582 1726855307.32492: done transferring module to remote 30582 1726855307.32502: _low_level_execute_command(): starting 30582 1726855307.32507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/ /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py && sleep 0' 30582 1726855307.33245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855307.33365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855307.33368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.33371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.33426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.35572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.35575: stdout chunk (state=3): >>><<< 30582 1726855307.35578: stderr chunk (state=3): >>><<< 30582 1726855307.35581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855307.35583: _low_level_execute_command(): starting 30582 1726855307.35585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/AnsiballZ_ping.py && sleep 0' 30582 1726855307.36765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855307.36994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855307.37010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.37021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.37124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.52196: stdout chunk (state=3): >>> {"ping": "pong", 
"invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855307.53545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.53551: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855307.53604: stderr chunk (state=3): >>><<< 30582 1726855307.53611: stdout chunk (state=3): >>><<< 30582 1726855307.53636: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855307.53660: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855307.53670: _low_level_execute_command(): starting 30582 1726855307.53675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855307.2403367-32599-58611571790699/ > /dev/null 2>&1 && sleep 0' 30582 1726855307.54521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855307.54692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.55056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.55149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.56996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.57192: stderr chunk (state=3): >>><<< 30582 1726855307.57195: stdout chunk (state=3): >>><<< 30582 1726855307.57198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855307.57205: handler run complete 30582 1726855307.57207: attempt loop complete, returning result 30582 1726855307.57208: _execute() done 30582 1726855307.57210: dumping result to json 30582 1726855307.57212: 
done dumping result, returning 30582 1726855307.57213: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000000d2b] 30582 1726855307.57215: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d2b 30582 1726855307.57273: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000d2b 30582 1726855307.57276: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855307.57349: no more pending results, returning what we have 30582 1726855307.57353: results queue empty 30582 1726855307.57354: checking for any_errors_fatal 30582 1726855307.57364: done checking for any_errors_fatal 30582 1726855307.57365: checking for max_fail_percentage 30582 1726855307.57367: done checking for max_fail_percentage 30582 1726855307.57368: checking to see if all hosts have failed and the running result is not ok 30582 1726855307.57369: done checking to see if all hosts have failed 30582 1726855307.57370: getting the remaining hosts for this loop 30582 1726855307.57373: done getting the remaining hosts for this loop 30582 1726855307.57377: getting the next task for host managed_node3 30582 1726855307.57393: done getting next task for host managed_node3 30582 1726855307.57395: ^ task is: TASK: meta (role_complete) 30582 1726855307.57406: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855307.57423: getting variables 30582 1726855307.57425: in VariableManager get_vars() 30582 1726855307.57467: Calling all_inventory to load vars for managed_node3 30582 1726855307.57470: Calling groups_inventory to load vars for managed_node3 30582 1726855307.57473: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.57484: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.57593: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.57694: Calling groups_plugins_play to load vars for managed_node3 30582 1726855307.61094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.63095: done with get_vars() 30582 1726855307.63129: done getting variables 30582 1726855307.63226: done queuing things up, now waiting for results queue to drain 30582 1726855307.63228: results queue empty 30582 1726855307.63229: checking for any_errors_fatal 30582 1726855307.63232: done checking for any_errors_fatal 30582 1726855307.63233: checking for max_fail_percentage 30582 1726855307.63234: done checking for max_fail_percentage 30582 1726855307.63235: checking to see if all hosts have failed and the running result is not ok 30582 1726855307.63236: done checking to see if all hosts have failed 30582 1726855307.63236: getting the remaining hosts for this 
loop 30582 1726855307.63237: done getting the remaining hosts for this loop 30582 1726855307.63242: getting the next task for host managed_node3 30582 1726855307.63249: done getting next task for host managed_node3 30582 1726855307.63251: ^ task is: TASK: Asserts 30582 1726855307.63253: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855307.63256: getting variables 30582 1726855307.63257: in VariableManager get_vars() 30582 1726855307.63268: Calling all_inventory to load vars for managed_node3 30582 1726855307.63270: Calling groups_inventory to load vars for managed_node3 30582 1726855307.63273: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.63278: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.63280: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.63282: Calling groups_plugins_play to load vars for managed_node3 30582 1726855307.64629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.66864: done with get_vars() 30582 1726855307.66893: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:01:47 -0400 (0:00:00.490) 0:00:44.019 ****** 30582 1726855307.66974: entering 
_queue_task() for managed_node3/include_tasks 30582 1726855307.67433: worker is 1 (out of 1 available) 30582 1726855307.67445: exiting _queue_task() for managed_node3/include_tasks 30582 1726855307.67458: done queuing things up, now waiting for results queue to drain 30582 1726855307.67459: waiting for pending results... 30582 1726855307.67818: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855307.67966: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4e 30582 1726855307.68021: variable 'ansible_search_path' from source: unknown 30582 1726855307.68030: variable 'ansible_search_path' from source: unknown 30582 1726855307.68053: variable 'lsr_assert' from source: include params 30582 1726855307.68311: variable 'lsr_assert' from source: include params 30582 1726855307.68394: variable 'omit' from source: magic vars 30582 1726855307.68566: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.68570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.68580: variable 'omit' from source: magic vars 30582 1726855307.68821: variable 'ansible_distribution_major_version' from source: facts 30582 1726855307.68894: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855307.68898: variable 'item' from source: unknown 30582 1726855307.68919: variable 'item' from source: unknown 30582 1726855307.69195: variable 'item' from source: unknown 30582 1726855307.69199: variable 'item' from source: unknown 30582 1726855307.69528: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.69532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.69535: variable 'omit' from source: magic vars 30582 1726855307.69613: variable 'ansible_distribution_major_version' from source: facts 30582 1726855307.69624: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855307.69641: 
variable 'item' from source: unknown 30582 1726855307.69703: variable 'item' from source: unknown 30582 1726855307.69743: variable 'item' from source: unknown 30582 1726855307.69811: variable 'item' from source: unknown 30582 1726855307.70005: dumping result to json 30582 1726855307.70009: done dumping result, returning 30582 1726855307.70011: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-000000000a4e] 30582 1726855307.70014: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4e 30582 1726855307.70059: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4e 30582 1726855307.70109: WORKER PROCESS EXITING 30582 1726855307.70138: no more pending results, returning what we have 30582 1726855307.70145: in VariableManager get_vars() 30582 1726855307.70195: Calling all_inventory to load vars for managed_node3 30582 1726855307.70198: Calling groups_inventory to load vars for managed_node3 30582 1726855307.70202: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.70217: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.70222: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.70225: Calling groups_plugins_play to load vars for managed_node3 30582 1726855307.71877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.73602: done with get_vars() 30582 1726855307.73643: variable 'ansible_search_path' from source: unknown 30582 1726855307.73644: variable 'ansible_search_path' from source: unknown 30582 1726855307.73700: variable 'ansible_search_path' from source: unknown 30582 1726855307.73701: variable 'ansible_search_path' from source: unknown 30582 1726855307.73737: we have included files to process 30582 1726855307.73744: generating all_blocks data 30582 1726855307.73746: done generating all_blocks data 30582 1726855307.73755: processing included file: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855307.73756: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855307.73759: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855307.73902: in VariableManager get_vars() 30582 1726855307.73925: done with get_vars() 30582 1726855307.74053: done processing included file 30582 1726855307.74056: iterating over new_blocks loaded from include file 30582 1726855307.74057: in VariableManager get_vars() 30582 1726855307.74079: done with get_vars() 30582 1726855307.74081: filtering new block on tags 30582 1726855307.74129: done filtering new block on tags 30582 1726855307.74132: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item=tasks/assert_device_present.yml) 30582 1726855307.74137: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855307.74138: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855307.74142: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30582 1726855307.74259: in VariableManager get_vars() 30582 1726855307.74281: done with get_vars() 30582 1726855307.74559: done processing included file 30582 1726855307.74561: iterating over new_blocks loaded from include file 30582 1726855307.74562: in VariableManager get_vars() 30582 1726855307.74581: done with 
get_vars() 30582 1726855307.74583: filtering new block on tags 30582 1726855307.74641: done filtering new block on tags 30582 1726855307.74647: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=tasks/assert_profile_present.yml) 30582 1726855307.74651: extending task lists for all hosts with included blocks 30582 1726855307.75768: done extending task lists 30582 1726855307.75770: done processing included files 30582 1726855307.75770: results queue empty 30582 1726855307.75771: checking for any_errors_fatal 30582 1726855307.75776: done checking for any_errors_fatal 30582 1726855307.75777: checking for max_fail_percentage 30582 1726855307.75778: done checking for max_fail_percentage 30582 1726855307.75779: checking to see if all hosts have failed and the running result is not ok 30582 1726855307.75780: done checking to see if all hosts have failed 30582 1726855307.75781: getting the remaining hosts for this loop 30582 1726855307.75782: done getting the remaining hosts for this loop 30582 1726855307.75784: getting the next task for host managed_node3 30582 1726855307.75790: done getting next task for host managed_node3 30582 1726855307.75793: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855307.75796: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855307.75805: getting variables 30582 1726855307.75806: in VariableManager get_vars() 30582 1726855307.75815: Calling all_inventory to load vars for managed_node3 30582 1726855307.75818: Calling groups_inventory to load vars for managed_node3 30582 1726855307.75821: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.75827: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.75835: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.75838: Calling groups_plugins_play to load vars for managed_node3 30582 1726855307.77069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.78685: done with get_vars() 30582 1726855307.78714: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 14:01:47 -0400 (0:00:00.118) 0:00:44.137 ****** 30582 1726855307.78796: entering _queue_task() for managed_node3/include_tasks 30582 1726855307.79170: worker is 1 (out of 1 available) 30582 1726855307.79183: exiting _queue_task() for managed_node3/include_tasks 30582 1726855307.79200: done queuing things up, now waiting for results queue to drain 30582 1726855307.79202: waiting for pending results... 
30582 1726855307.79614: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855307.79742: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e86 30582 1726855307.79778: variable 'ansible_search_path' from source: unknown 30582 1726855307.79867: variable 'ansible_search_path' from source: unknown 30582 1726855307.79877: calling self._execute() 30582 1726855307.79958: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.79978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.80003: variable 'omit' from source: magic vars 30582 1726855307.80481: variable 'ansible_distribution_major_version' from source: facts 30582 1726855307.80540: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855307.80544: _execute() done 30582 1726855307.80551: dumping result to json 30582 1726855307.80555: done dumping result, returning 30582 1726855307.80563: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-000000000e86] 30582 1726855307.80567: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e86 30582 1726855307.80842: no more pending results, returning what we have 30582 1726855307.80851: in VariableManager get_vars() 30582 1726855307.80902: Calling all_inventory to load vars for managed_node3 30582 1726855307.80906: Calling groups_inventory to load vars for managed_node3 30582 1726855307.80913: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.80920: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e86 30582 1726855307.80928: WORKER PROCESS EXITING 30582 1726855307.81110: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.81113: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.81117: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855307.83177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.85336: done with get_vars() 30582 1726855307.85363: variable 'ansible_search_path' from source: unknown 30582 1726855307.85364: variable 'ansible_search_path' from source: unknown 30582 1726855307.85396: variable 'item' from source: include params 30582 1726855307.85518: variable 'item' from source: include params 30582 1726855307.85558: we have included files to process 30582 1726855307.85559: generating all_blocks data 30582 1726855307.85561: done generating all_blocks data 30582 1726855307.85562: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855307.85563: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855307.85567: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855307.85845: done processing included file 30582 1726855307.85847: iterating over new_blocks loaded from include file 30582 1726855307.85849: in VariableManager get_vars() 30582 1726855307.85909: done with get_vars() 30582 1726855307.85912: filtering new block on tags 30582 1726855307.85958: done filtering new block on tags 30582 1726855307.85962: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855307.85967: extending task lists for all hosts with included blocks 30582 1726855307.86280: done extending task lists 30582 1726855307.86281: done processing included files 30582 1726855307.86282: results queue empty 30582 1726855307.86283: checking for any_errors_fatal 30582 1726855307.86289: done 
checking for any_errors_fatal 30582 1726855307.86290: checking for max_fail_percentage 30582 1726855307.86292: done checking for max_fail_percentage 30582 1726855307.86292: checking to see if all hosts have failed and the running result is not ok 30582 1726855307.86293: done checking to see if all hosts have failed 30582 1726855307.86294: getting the remaining hosts for this loop 30582 1726855307.86296: done getting the remaining hosts for this loop 30582 1726855307.86299: getting the next task for host managed_node3 30582 1726855307.86303: done getting next task for host managed_node3 30582 1726855307.86306: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855307.86313: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855307.86316: getting variables 30582 1726855307.86321: in VariableManager get_vars() 30582 1726855307.86333: Calling all_inventory to load vars for managed_node3 30582 1726855307.86335: Calling groups_inventory to load vars for managed_node3 30582 1726855307.86338: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855307.86344: Calling all_plugins_play to load vars for managed_node3 30582 1726855307.86346: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855307.86354: Calling groups_plugins_play to load vars for managed_node3 30582 1726855307.87922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855307.89700: done with get_vars() 30582 1726855307.89735: done getting variables 30582 1726855307.90019: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:01:47 -0400 (0:00:00.112) 0:00:44.250 ****** 30582 1726855307.90094: entering _queue_task() for managed_node3/stat 30582 1726855307.90767: worker is 1 (out of 1 available) 30582 1726855307.90791: exiting _queue_task() for managed_node3/stat 30582 1726855307.90809: done queuing things up, now waiting for results queue to drain 30582 1726855307.90811: waiting for pending results... 
30582 1726855307.91281: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855307.91414: in run() - task 0affcc66-ac2b-aa83-7d57-000000000ef5 30582 1726855307.91427: variable 'ansible_search_path' from source: unknown 30582 1726855307.91430: variable 'ansible_search_path' from source: unknown 30582 1726855307.91469: calling self._execute() 30582 1726855307.91805: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.91809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.91816: variable 'omit' from source: magic vars 30582 1726855307.92593: variable 'ansible_distribution_major_version' from source: facts 30582 1726855307.92597: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855307.92599: variable 'omit' from source: magic vars 30582 1726855307.92602: variable 'omit' from source: magic vars 30582 1726855307.92605: variable 'interface' from source: play vars 30582 1726855307.92607: variable 'omit' from source: magic vars 30582 1726855307.92609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855307.92673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855307.92719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855307.92767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855307.92777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855307.92824: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855307.92827: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.92830: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.93092: Set connection var ansible_timeout to 10 30582 1726855307.93096: Set connection var ansible_connection to ssh 30582 1726855307.93098: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855307.93101: Set connection var ansible_pipelining to False 30582 1726855307.93103: Set connection var ansible_shell_executable to /bin/sh 30582 1726855307.93105: Set connection var ansible_shell_type to sh 30582 1726855307.93107: variable 'ansible_shell_executable' from source: unknown 30582 1726855307.93109: variable 'ansible_connection' from source: unknown 30582 1726855307.93111: variable 'ansible_module_compression' from source: unknown 30582 1726855307.93113: variable 'ansible_shell_type' from source: unknown 30582 1726855307.93116: variable 'ansible_shell_executable' from source: unknown 30582 1726855307.93118: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855307.93119: variable 'ansible_pipelining' from source: unknown 30582 1726855307.93122: variable 'ansible_timeout' from source: unknown 30582 1726855307.93124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855307.93269: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855307.93282: variable 'omit' from source: magic vars 30582 1726855307.93290: starting attempt loop 30582 1726855307.93294: running the handler 30582 1726855307.93313: _low_level_execute_command(): starting 30582 1726855307.93320: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855307.94596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855307.94643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855307.94655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.94682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.94776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855307.96599: stdout chunk (state=3): >>>/root <<< 30582 1726855307.96788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855307.96797: stdout chunk (state=3): >>><<< 30582 1726855307.96800: stderr chunk (state=3): >>><<< 30582 1726855307.96804: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855307.96806: _low_level_execute_command(): starting 30582 1726855307.96810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273 `" && echo ansible-tmp-1726855307.9672902-32637-8921100508273="` echo /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273 `" ) && sleep 0' 30582 1726855307.98418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855307.98607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855307.98770: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855307.98831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855307.99065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.01091: stdout chunk (state=3): >>>ansible-tmp-1726855307.9672902-32637-8921100508273=/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273 <<< 30582 1726855308.01111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.01167: stderr chunk (state=3): >>><<< 30582 1726855308.01190: stdout chunk (state=3): >>><<< 30582 1726855308.01221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855307.9672902-32637-8921100508273=/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.01278: variable 'ansible_module_compression' from source: unknown 30582 1726855308.01361: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855308.01415: variable 'ansible_facts' from source: unknown 30582 1726855308.01521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py 30582 1726855308.01755: Sending initial data 30582 1726855308.01762: Sent initial data (151 bytes) 30582 1726855308.02427: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.02453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855308.02474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855308.02570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855308.02612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.02635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.02792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.02942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.04511: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855308.04537: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855308.04640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855308.04754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsqeirfb4 /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py <<< 30582 1726855308.04765: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py" <<< 30582 1726855308.04827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30582 1726855308.04852: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsqeirfb4" to remote "/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py" <<< 30582 1726855308.05999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.06003: stdout chunk (state=3): >>><<< 30582 1726855308.06005: stderr chunk (state=3): >>><<< 30582 1726855308.06007: done transferring module to remote 30582 1726855308.06010: _low_level_execute_command(): starting 30582 1726855308.06012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/ /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py && sleep 0' 30582 1726855308.06603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.06617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855308.06629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855308.06645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855308.06750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.06776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.06796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.06891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.08723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.08736: stdout chunk (state=3): >>><<< 30582 1726855308.08749: stderr chunk (state=3): >>><<< 30582 1726855308.08772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.08782: _low_level_execute_command(): starting 30582 1726855308.08795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/AnsiballZ_stat.py && sleep 0' 30582 1726855308.09405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.09421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855308.09438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855308.09457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855308.09473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855308.09502: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855308.09515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855308.09591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.09615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.09699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.25125: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32367, "dev": 23, "nlink": 1, "atime": 1726855299.1073089, "mtime": 1726855299.1073089, "ctime": 1726855299.1073089, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855308.26381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855308.26385: stdout chunk (state=3): >>><<< 30582 1726855308.26592: stderr chunk (state=3): >>><<< 30582 1726855308.26598: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32367, "dev": 23, "nlink": 1, "atime": 1726855299.1073089, "mtime": 1726855299.1073089, "ctime": 1726855299.1073089, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855308.26600: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855308.26603: _low_level_execute_command(): starting 30582 1726855308.26605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855307.9672902-32637-8921100508273/ > /dev/null 2>&1 && sleep 0' 30582 1726855308.27529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.27801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.27883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.27905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.28003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.29846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.29910: stderr chunk (state=3): >>><<< 30582 1726855308.29918: stdout chunk (state=3): >>><<< 30582 1726855308.29938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.29949: handler run complete 30582 1726855308.30004: attempt loop complete, returning result 30582 1726855308.30010: _execute() done 30582 1726855308.30016: dumping result to json 30582 1726855308.30025: done dumping result, returning 30582 1726855308.30034: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000000ef5] 30582 1726855308.30042: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ef5 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726855299.1073089, "block_size": 4096, "blocks": 0, "ctime": 1726855299.1073089, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32367, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726855299.1073089, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30582 1726855308.30276: no more pending results, returning what we have 30582 1726855308.30279: results queue empty 30582 1726855308.30280: checking for any_errors_fatal 30582 1726855308.30282: done checking for any_errors_fatal 30582 1726855308.30283: checking for max_fail_percentage 30582 1726855308.30285: done checking for max_fail_percentage 30582 1726855308.30286: checking to see if all hosts have 
failed and the running result is not ok 30582 1726855308.30286: done checking to see if all hosts have failed 30582 1726855308.30289: getting the remaining hosts for this loop 30582 1726855308.30290: done getting the remaining hosts for this loop 30582 1726855308.30295: getting the next task for host managed_node3 30582 1726855308.30306: done getting next task for host managed_node3 30582 1726855308.30308: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30582 1726855308.30312: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855308.30317: getting variables 30582 1726855308.30319: in VariableManager get_vars() 30582 1726855308.30353: Calling all_inventory to load vars for managed_node3 30582 1726855308.30355: Calling groups_inventory to load vars for managed_node3 30582 1726855308.30358: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855308.30371: Calling all_plugins_play to load vars for managed_node3 30582 1726855308.30374: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855308.30377: Calling groups_plugins_play to load vars for managed_node3 30582 1726855308.30903: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000ef5 30582 1726855308.30907: WORKER PROCESS EXITING 30582 1726855308.32005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855308.34069: done with get_vars() 30582 1726855308.34100: done getting variables 30582 1726855308.34158: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855308.34281: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 14:01:48 -0400 (0:00:00.442) 0:00:44.693 ****** 30582 1726855308.34314: entering _queue_task() for managed_node3/assert 30582 1726855308.34658: worker is 1 (out of 1 available) 30582 1726855308.34672: exiting _queue_task() for managed_node3/assert 30582 1726855308.34686: done queuing things up, now waiting for results queue to drain 30582 1726855308.34791: waiting for pending results... 
30582 1726855308.35007: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30582 1726855308.35294: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e87 30582 1726855308.35299: variable 'ansible_search_path' from source: unknown 30582 1726855308.35302: variable 'ansible_search_path' from source: unknown 30582 1726855308.35306: calling self._execute() 30582 1726855308.35308: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.35311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.35313: variable 'omit' from source: magic vars 30582 1726855308.35688: variable 'ansible_distribution_major_version' from source: facts 30582 1726855308.35708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855308.35721: variable 'omit' from source: magic vars 30582 1726855308.35776: variable 'omit' from source: magic vars 30582 1726855308.35888: variable 'interface' from source: play vars 30582 1726855308.35913: variable 'omit' from source: magic vars 30582 1726855308.35956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855308.36006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855308.36034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855308.36056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.36072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.36116: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855308.36124: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.36132: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.36251: Set connection var ansible_timeout to 10 30582 1726855308.36258: Set connection var ansible_connection to ssh 30582 1726855308.36271: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855308.36284: Set connection var ansible_pipelining to False 30582 1726855308.36300: Set connection var ansible_shell_executable to /bin/sh 30582 1726855308.36303: Set connection var ansible_shell_type to sh 30582 1726855308.36410: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.36413: variable 'ansible_connection' from source: unknown 30582 1726855308.36416: variable 'ansible_module_compression' from source: unknown 30582 1726855308.36418: variable 'ansible_shell_type' from source: unknown 30582 1726855308.36420: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.36422: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.36424: variable 'ansible_pipelining' from source: unknown 30582 1726855308.36426: variable 'ansible_timeout' from source: unknown 30582 1726855308.36428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.36531: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855308.36548: variable 'omit' from source: magic vars 30582 1726855308.36558: starting attempt loop 30582 1726855308.36565: running the handler 30582 1726855308.36715: variable 'interface_stat' from source: set_fact 30582 1726855308.36744: Evaluated conditional (interface_stat.stat.exists): True 30582 1726855308.36755: handler run complete 30582 1726855308.36772: attempt loop complete, returning result 30582 
1726855308.36783: _execute() done 30582 1726855308.36792: dumping result to json 30582 1726855308.36801: done dumping result, returning 30582 1726855308.36845: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000e87] 30582 1726855308.36849: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e87 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855308.37005: no more pending results, returning what we have 30582 1726855308.37009: results queue empty 30582 1726855308.37010: checking for any_errors_fatal 30582 1726855308.37020: done checking for any_errors_fatal 30582 1726855308.37021: checking for max_fail_percentage 30582 1726855308.37024: done checking for max_fail_percentage 30582 1726855308.37025: checking to see if all hosts have failed and the running result is not ok 30582 1726855308.37026: done checking to see if all hosts have failed 30582 1726855308.37027: getting the remaining hosts for this loop 30582 1726855308.37029: done getting the remaining hosts for this loop 30582 1726855308.37033: getting the next task for host managed_node3 30582 1726855308.37045: done getting next task for host managed_node3 30582 1726855308.37048: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855308.37053: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855308.37058: getting variables 30582 1726855308.37060: in VariableManager get_vars() 30582 1726855308.37105: Calling all_inventory to load vars for managed_node3 30582 1726855308.37108: Calling groups_inventory to load vars for managed_node3 30582 1726855308.37112: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855308.37124: Calling all_plugins_play to load vars for managed_node3 30582 1726855308.37128: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855308.37131: Calling groups_plugins_play to load vars for managed_node3 30582 1726855308.37801: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e87 30582 1726855308.37805: WORKER PROCESS EXITING 30582 1726855308.39009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855308.42221: done with get_vars() 30582 1726855308.42252: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 14:01:48 -0400 (0:00:00.081) 0:00:44.774 ****** 30582 1726855308.42463: entering _queue_task() for managed_node3/include_tasks 30582 1726855308.43110: worker is 1 (out of 1 available) 30582 1726855308.43123: exiting _queue_task() for managed_node3/include_tasks 30582 1726855308.43136: done queuing things up, now waiting for results queue to drain 30582 1726855308.43138: waiting for pending results... 
30582 1726855308.43430: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855308.43568: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e8b 30582 1726855308.43595: variable 'ansible_search_path' from source: unknown 30582 1726855308.43604: variable 'ansible_search_path' from source: unknown 30582 1726855308.43694: calling self._execute() 30582 1726855308.43762: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.43778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.43796: variable 'omit' from source: magic vars 30582 1726855308.44196: variable 'ansible_distribution_major_version' from source: facts 30582 1726855308.44215: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855308.44225: _execute() done 30582 1726855308.44234: dumping result to json 30582 1726855308.44267: done dumping result, returning 30582 1726855308.44270: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-000000000e8b] 30582 1726855308.44275: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8b 30582 1726855308.44408: no more pending results, returning what we have 30582 1726855308.44414: in VariableManager get_vars() 30582 1726855308.44459: Calling all_inventory to load vars for managed_node3 30582 1726855308.44462: Calling groups_inventory to load vars for managed_node3 30582 1726855308.44466: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855308.44485: Calling all_plugins_play to load vars for managed_node3 30582 1726855308.44491: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855308.44495: Calling groups_plugins_play to load vars for managed_node3 30582 1726855308.45959: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8b 30582 1726855308.45962: WORKER PROCESS EXITING 30582 
1726855308.46584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855308.48089: done with get_vars() 30582 1726855308.48114: variable 'ansible_search_path' from source: unknown 30582 1726855308.48115: variable 'ansible_search_path' from source: unknown 30582 1726855308.48125: variable 'item' from source: include params 30582 1726855308.48226: variable 'item' from source: include params 30582 1726855308.48259: we have included files to process 30582 1726855308.48260: generating all_blocks data 30582 1726855308.48263: done generating all_blocks data 30582 1726855308.48268: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855308.48269: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855308.48271: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855308.49230: done processing included file 30582 1726855308.49232: iterating over new_blocks loaded from include file 30582 1726855308.49233: in VariableManager get_vars() 30582 1726855308.49248: done with get_vars() 30582 1726855308.49250: filtering new block on tags 30582 1726855308.49316: done filtering new block on tags 30582 1726855308.49320: in VariableManager get_vars() 30582 1726855308.49335: done with get_vars() 30582 1726855308.49337: filtering new block on tags 30582 1726855308.49399: done filtering new block on tags 30582 1726855308.49402: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855308.49408: extending task lists for all hosts with included blocks 30582 1726855308.49981: done 
extending task lists 30582 1726855308.49983: done processing included files 30582 1726855308.49984: results queue empty 30582 1726855308.49985: checking for any_errors_fatal 30582 1726855308.50218: done checking for any_errors_fatal 30582 1726855308.50220: checking for max_fail_percentage 30582 1726855308.50221: done checking for max_fail_percentage 30582 1726855308.50222: checking to see if all hosts have failed and the running result is not ok 30582 1726855308.50223: done checking to see if all hosts have failed 30582 1726855308.50223: getting the remaining hosts for this loop 30582 1726855308.50225: done getting the remaining hosts for this loop 30582 1726855308.50228: getting the next task for host managed_node3 30582 1726855308.50233: done getting next task for host managed_node3 30582 1726855308.50235: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855308.50238: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855308.50240: getting variables 30582 1726855308.50241: in VariableManager get_vars() 30582 1726855308.50252: Calling all_inventory to load vars for managed_node3 30582 1726855308.50254: Calling groups_inventory to load vars for managed_node3 30582 1726855308.50256: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855308.50263: Calling all_plugins_play to load vars for managed_node3 30582 1726855308.50265: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855308.50268: Calling groups_plugins_play to load vars for managed_node3 30582 1726855308.51727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855308.54677: done with get_vars() 30582 1726855308.54703: done getting variables 30582 1726855308.54752: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:01:48 -0400 (0:00:00.123) 0:00:44.898 ****** 30582 1726855308.54861: entering _queue_task() for managed_node3/set_fact 30582 1726855308.55555: worker is 1 (out of 1 available) 30582 1726855308.55566: exiting _queue_task() for managed_node3/set_fact 30582 1726855308.55580: done queuing things up, now waiting for results queue to drain 30582 1726855308.55582: waiting for pending results... 
30582 1726855308.55939: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855308.56168: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f13 30582 1726855308.56270: variable 'ansible_search_path' from source: unknown 30582 1726855308.56279: variable 'ansible_search_path' from source: unknown 30582 1726855308.56324: calling self._execute() 30582 1726855308.56454: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.56618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.56635: variable 'omit' from source: magic vars 30582 1726855308.57220: variable 'ansible_distribution_major_version' from source: facts 30582 1726855308.57238: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855308.57250: variable 'omit' from source: magic vars 30582 1726855308.57310: variable 'omit' from source: magic vars 30582 1726855308.57350: variable 'omit' from source: magic vars 30582 1726855308.57404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855308.57444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855308.57477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855308.57506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.57527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.57563: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855308.57572: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.57584: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855308.57701: Set connection var ansible_timeout to 10 30582 1726855308.57711: Set connection var ansible_connection to ssh 30582 1726855308.57724: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855308.57735: Set connection var ansible_pipelining to False 30582 1726855308.57791: Set connection var ansible_shell_executable to /bin/sh 30582 1726855308.57794: Set connection var ansible_shell_type to sh 30582 1726855308.57800: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.57802: variable 'ansible_connection' from source: unknown 30582 1726855308.57805: variable 'ansible_module_compression' from source: unknown 30582 1726855308.57806: variable 'ansible_shell_type' from source: unknown 30582 1726855308.57808: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.57810: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.57812: variable 'ansible_pipelining' from source: unknown 30582 1726855308.57814: variable 'ansible_timeout' from source: unknown 30582 1726855308.57815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.57941: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855308.57959: variable 'omit' from source: magic vars 30582 1726855308.57968: starting attempt loop 30582 1726855308.57974: running the handler 30582 1726855308.57991: handler run complete 30582 1726855308.58017: attempt loop complete, returning result 30582 1726855308.58020: _execute() done 30582 1726855308.58021: dumping result to json 30582 1726855308.58023: done dumping result, returning 30582 1726855308.58093: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-000000000f13] 30582 1726855308.58096: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f13 30582 1726855308.58164: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f13 30582 1726855308.58168: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855308.58223: no more pending results, returning what we have 30582 1726855308.58227: results queue empty 30582 1726855308.58228: checking for any_errors_fatal 30582 1726855308.58229: done checking for any_errors_fatal 30582 1726855308.58230: checking for max_fail_percentage 30582 1726855308.58232: done checking for max_fail_percentage 30582 1726855308.58233: checking to see if all hosts have failed and the running result is not ok 30582 1726855308.58234: done checking to see if all hosts have failed 30582 1726855308.58234: getting the remaining hosts for this loop 30582 1726855308.58236: done getting the remaining hosts for this loop 30582 1726855308.58240: getting the next task for host managed_node3 30582 1726855308.58249: done getting next task for host managed_node3 30582 1726855308.58251: ^ task is: TASK: Stat profile file 30582 1726855308.58257: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855308.58260: getting variables 30582 1726855308.58263: in VariableManager get_vars() 30582 1726855308.58498: Calling all_inventory to load vars for managed_node3 30582 1726855308.58501: Calling groups_inventory to load vars for managed_node3 30582 1726855308.58503: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855308.58513: Calling all_plugins_play to load vars for managed_node3 30582 1726855308.58516: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855308.58519: Calling groups_plugins_play to load vars for managed_node3 30582 1726855308.60099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855308.62040: done with get_vars() 30582 1726855308.62066: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 14:01:48 -0400 (0:00:00.073) 0:00:44.971 ****** 30582 1726855308.62181: entering _queue_task() for managed_node3/stat 30582 1726855308.62551: worker is 1 (out of 1 available) 30582 1726855308.62566: exiting _queue_task() for managed_node3/stat 30582 1726855308.62694: done queuing things up, now waiting for results queue to drain 30582 1726855308.62696: 
waiting for pending results... 30582 1726855308.62981: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855308.63194: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f14 30582 1726855308.63199: variable 'ansible_search_path' from source: unknown 30582 1726855308.63203: variable 'ansible_search_path' from source: unknown 30582 1726855308.63240: calling self._execute() 30582 1726855308.63429: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.63475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.63479: variable 'omit' from source: magic vars 30582 1726855308.64023: variable 'ansible_distribution_major_version' from source: facts 30582 1726855308.64095: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855308.64098: variable 'omit' from source: magic vars 30582 1726855308.64101: variable 'omit' from source: magic vars 30582 1726855308.64218: variable 'profile' from source: play vars 30582 1726855308.64260: variable 'interface' from source: play vars 30582 1726855308.64328: variable 'interface' from source: play vars 30582 1726855308.64358: variable 'omit' from source: magic vars 30582 1726855308.64406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855308.64449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855308.64475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855308.64508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.64530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855308.64729: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855308.64732: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.64735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.64767: Set connection var ansible_timeout to 10 30582 1726855308.64770: Set connection var ansible_connection to ssh 30582 1726855308.64779: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855308.64784: Set connection var ansible_pipelining to False 30582 1726855308.64790: Set connection var ansible_shell_executable to /bin/sh 30582 1726855308.64793: Set connection var ansible_shell_type to sh 30582 1726855308.64839: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.64852: variable 'ansible_connection' from source: unknown 30582 1726855308.64993: variable 'ansible_module_compression' from source: unknown 30582 1726855308.64996: variable 'ansible_shell_type' from source: unknown 30582 1726855308.64999: variable 'ansible_shell_executable' from source: unknown 30582 1726855308.65001: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855308.65004: variable 'ansible_pipelining' from source: unknown 30582 1726855308.65007: variable 'ansible_timeout' from source: unknown 30582 1726855308.65011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855308.65309: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855308.65321: variable 'omit' from source: magic vars 30582 1726855308.65344: starting attempt loop 30582 1726855308.65347: running the handler 30582 1726855308.65390: _low_level_execute_command(): starting 30582 1726855308.65409: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 
1726855308.66619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.66772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855308.66778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.66781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.66818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.66914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.68605: stdout chunk (state=3): >>>/root <<< 30582 1726855308.68747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.68782: stderr chunk (state=3): >>><<< 30582 1726855308.68786: stdout chunk (state=3): >>><<< 30582 1726855308.68912: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.68915: _low_level_execute_command(): starting 30582 1726855308.68918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697 `" && echo ansible-tmp-1726855308.6881225-32684-257593314889697="` echo /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697 `" ) && sleep 0' 30582 1726855308.69602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.69605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.69660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.69714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.71808: stdout chunk (state=3): >>>ansible-tmp-1726855308.6881225-32684-257593314889697=/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697 <<< 30582 1726855308.71876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.71935: stderr chunk (state=3): >>><<< 30582 1726855308.71958: stdout chunk (state=3): >>><<< 30582 1726855308.72028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855308.6881225-32684-257593314889697=/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.72081: variable 'ansible_module_compression' from source: unknown 30582 1726855308.72184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855308.72260: variable 'ansible_facts' from source: unknown 30582 1726855308.72400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py 30582 1726855308.72715: Sending initial data 30582 1726855308.72718: Sent initial data (153 bytes) 30582 1726855308.73403: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.73420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855308.73535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.73583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.73686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.75324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855308.75328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855308.75392: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpmzor81cw /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py <<< 30582 1726855308.75395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py" <<< 30582 1726855308.75502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpmzor81cw" to remote "/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py" <<< 30582 1726855308.76349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.76453: stderr chunk (state=3): >>><<< 30582 1726855308.76463: stdout chunk (state=3): >>><<< 30582 1726855308.76603: done transferring module to remote 30582 1726855308.76640: _low_level_execute_command(): starting 30582 1726855308.76643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/ /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py && sleep 0' 30582 1726855308.77536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855308.77540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.77542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.77638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.77769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.79578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855308.79651: stderr chunk (state=3): >>><<< 30582 1726855308.79678: stdout chunk (state=3): >>><<< 30582 1726855308.79821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855308.79834: _low_level_execute_command(): starting 30582 1726855308.79838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/AnsiballZ_stat.py && sleep 0' 30582 1726855308.81134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855308.81196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855308.81229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855308.81356: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855308.81359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855308.81445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.81526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30582 1726855308.81624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855308.96658: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855308.98111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855308.98115: stdout chunk (state=3): >>><<< 30582 1726855308.98117: stderr chunk (state=3): >>><<< 30582 1726855308.98135: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855308.98172: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855308.98217: _low_level_execute_command(): starting 30582 1726855308.98220: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855308.6881225-32684-257593314889697/ > /dev/null 2>&1 && sleep 0' 30582 1726855308.98850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855308.98962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855308.99010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855308.99068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.01009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855309.01013: stdout chunk (state=3): >>><<< 30582 1726855309.01015: stderr chunk (state=3): >>><<< 30582 1726855309.01032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855309.01044: handler run complete 30582 1726855309.01092: attempt loop complete, returning result 30582 1726855309.01096: _execute() done 30582 1726855309.01098: dumping result to json 30582 1726855309.01100: done dumping result, returning 30582 1726855309.01193: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-000000000f14] 30582 1726855309.01200: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f14 30582 1726855309.01280: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f14 30582 1726855309.01284: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855309.01349: no more pending results, returning what we have 30582 1726855309.01353: results queue empty 30582 1726855309.01354: checking for any_errors_fatal 30582 1726855309.01362: done checking for any_errors_fatal 30582 1726855309.01363: checking for max_fail_percentage 30582 1726855309.01366: done checking for max_fail_percentage 30582 1726855309.01367: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.01368: done checking to see if all hosts have failed 30582 1726855309.01368: getting the remaining hosts for this loop 30582 1726855309.01370: done getting the remaining hosts for this loop 30582 1726855309.01377: getting the next task for host managed_node3 30582 1726855309.01386: done getting next task for host managed_node3 30582 1726855309.01392: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855309.01397: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855309.01403: getting variables 30582 1726855309.01404: in VariableManager get_vars() 30582 1726855309.01442: Calling all_inventory to load vars for managed_node3 30582 1726855309.01445: Calling groups_inventory to load vars for managed_node3 30582 1726855309.01448: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.01462: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.01467: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.01470: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.03354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.05080: done with get_vars() 30582 1726855309.05118: done getting variables 30582 1726855309.05192: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:01:49 -0400 (0:00:00.430) 0:00:45.402 ****** 30582 1726855309.05232: entering _queue_task() for managed_node3/set_fact 30582 1726855309.05860: worker is 1 (out of 1 available) 30582 1726855309.05876: exiting _queue_task() for managed_node3/set_fact 30582 1726855309.05897: done queuing things up, now waiting for results queue to drain 30582 1726855309.05899: waiting for pending results... 30582 1726855309.06210: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855309.06218: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f15 30582 1726855309.06236: variable 'ansible_search_path' from source: unknown 30582 1726855309.06239: variable 'ansible_search_path' from source: unknown 30582 1726855309.06280: calling self._execute() 30582 1726855309.06384: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.06399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.06437: variable 'omit' from source: magic vars 30582 1726855309.06842: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.06860: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.07010: variable 'profile_stat' from source: set_fact 30582 1726855309.07194: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855309.07197: when evaluation is False, skipping this task 30582 1726855309.07200: _execute() done 30582 1726855309.07203: dumping result to json 30582 1726855309.07205: done dumping 
result, returning 30582 1726855309.07208: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-000000000f15] 30582 1726855309.07211: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f15 30582 1726855309.07283: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f15 30582 1726855309.07289: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855309.07344: no more pending results, returning what we have 30582 1726855309.07349: results queue empty 30582 1726855309.07350: checking for any_errors_fatal 30582 1726855309.07364: done checking for any_errors_fatal 30582 1726855309.07365: checking for max_fail_percentage 30582 1726855309.07367: done checking for max_fail_percentage 30582 1726855309.07368: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.07369: done checking to see if all hosts have failed 30582 1726855309.07369: getting the remaining hosts for this loop 30582 1726855309.07371: done getting the remaining hosts for this loop 30582 1726855309.07378: getting the next task for host managed_node3 30582 1726855309.07391: done getting next task for host managed_node3 30582 1726855309.07394: ^ task is: TASK: Get NM profile info 30582 1726855309.07404: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855309.07410: getting variables 30582 1726855309.07412: in VariableManager get_vars() 30582 1726855309.07451: Calling all_inventory to load vars for managed_node3 30582 1726855309.07455: Calling groups_inventory to load vars for managed_node3 30582 1726855309.07459: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.07477: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.07482: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.07486: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.09210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.10910: done with get_vars() 30582 1726855309.10947: done getting variables 30582 1726855309.11018: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:01:49 -0400 (0:00:00.058) 
0:00:45.460 ****** 30582 1726855309.11059: entering _queue_task() for managed_node3/shell 30582 1726855309.11452: worker is 1 (out of 1 available) 30582 1726855309.11468: exiting _queue_task() for managed_node3/shell 30582 1726855309.11486: done queuing things up, now waiting for results queue to drain 30582 1726855309.11489: waiting for pending results... 30582 1726855309.11825: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855309.11893: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f16 30582 1726855309.11898: variable 'ansible_search_path' from source: unknown 30582 1726855309.11901: variable 'ansible_search_path' from source: unknown 30582 1726855309.12020: calling self._execute() 30582 1726855309.12025: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.12028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.12031: variable 'omit' from source: magic vars 30582 1726855309.12368: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.12379: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.12388: variable 'omit' from source: magic vars 30582 1726855309.12445: variable 'omit' from source: magic vars 30582 1726855309.12562: variable 'profile' from source: play vars 30582 1726855309.12565: variable 'interface' from source: play vars 30582 1726855309.12607: variable 'interface' from source: play vars 30582 1726855309.12671: variable 'omit' from source: magic vars 30582 1726855309.12677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855309.12705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855309.12727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855309.12745: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855309.12757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855309.12792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855309.12796: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.12798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.13011: Set connection var ansible_timeout to 10 30582 1726855309.13014: Set connection var ansible_connection to ssh 30582 1726855309.13017: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855309.13019: Set connection var ansible_pipelining to False 30582 1726855309.13021: Set connection var ansible_shell_executable to /bin/sh 30582 1726855309.13024: Set connection var ansible_shell_type to sh 30582 1726855309.13026: variable 'ansible_shell_executable' from source: unknown 30582 1726855309.13028: variable 'ansible_connection' from source: unknown 30582 1726855309.13030: variable 'ansible_module_compression' from source: unknown 30582 1726855309.13032: variable 'ansible_shell_type' from source: unknown 30582 1726855309.13034: variable 'ansible_shell_executable' from source: unknown 30582 1726855309.13036: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.13038: variable 'ansible_pipelining' from source: unknown 30582 1726855309.13040: variable 'ansible_timeout' from source: unknown 30582 1726855309.13042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.13103: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855309.13119: variable 'omit' from source: magic vars 30582 1726855309.13122: starting attempt loop 30582 1726855309.13125: running the handler 30582 1726855309.13157: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855309.13161: _low_level_execute_command(): starting 30582 1726855309.13163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855309.13847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855309.13865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.13919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.13924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.13926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855309.13997: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855309.14019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855309.14114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.15818: stdout chunk (state=3): >>>/root <<< 30582 1726855309.15984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855309.15990: stdout chunk (state=3): >>><<< 30582 1726855309.15993: stderr chunk (state=3): >>><<< 30582 1726855309.16019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855309.16123: _low_level_execute_command(): starting 30582 
1726855309.16126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307 `" && echo ansible-tmp-1726855309.160262-32723-1662494958307="` echo /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307 `" ) && sleep 0' 30582 1726855309.16684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855309.16708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.16724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.16750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.16765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855309.16780: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855309.16854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.16893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855309.16911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855309.16932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 
1726855309.17031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.18919: stdout chunk (state=3): >>>ansible-tmp-1726855309.160262-32723-1662494958307=/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307 <<< 30582 1726855309.19064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855309.19095: stderr chunk (state=3): >>><<< 30582 1726855309.19099: stdout chunk (state=3): >>><<< 30582 1726855309.19118: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855309.160262-32723-1662494958307=/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855309.19144: variable 'ansible_module_compression' from source: unknown 30582 1726855309.19186: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855309.19221: variable 'ansible_facts' from source: unknown 30582 1726855309.19492: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py 30582 1726855309.19496: Sending initial data 30582 1726855309.19498: Sent initial data (153 bytes) 30582 1726855309.19855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.19869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.19885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.19927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855309.19942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855309.20007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.21579: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855309.21633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855309.21697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp6ohlxh2_ /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py <<< 30582 1726855309.21701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py" <<< 30582 1726855309.21755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp6ohlxh2_" to remote "/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py" <<< 30582 1726855309.21760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py" <<< 30582 1726855309.22336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855309.22376: stderr chunk (state=3): >>><<< 30582 1726855309.22380: stdout chunk (state=3): >>><<< 30582 1726855309.22429: done transferring module to remote 30582 1726855309.22439: _low_level_execute_command(): starting 
30582 1726855309.22442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/ /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py && sleep 0' 30582 1726855309.22868: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.22908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.22915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855309.22917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.22919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.22922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.22967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855309.22970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855309.22973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855309.23033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.24786: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 30582 1726855309.24814: stderr chunk (state=3): >>><<< 30582 1726855309.24819: stdout chunk (state=3): >>><<< 30582 1726855309.24836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855309.24839: _low_level_execute_command(): starting 30582 1726855309.24844: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/AnsiballZ_command.py && sleep 0' 30582 1726855309.25285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.25309: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855309.25312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.25315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.25368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855309.25376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855309.25379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855309.25435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.42729: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:49.407486", "end": "2024-09-20 14:01:49.425164", "delta": "0:00:00.017678", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855309.44146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
30582 1726855309.44151: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855309.44209: stderr chunk (state=3): >>><<< 30582 1726855309.44214: stdout chunk (state=3): >>><<< 30582 1726855309.44242: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:01:49.407486", "end": "2024-09-20 14:01:49.425164", "delta": "0:00:00.017678", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855309.44302: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855309.44311: _low_level_execute_command(): starting 30582 1726855309.44314: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855309.160262-32723-1662494958307/ > /dev/null 2>&1 && sleep 0' 30582 1726855309.44949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855309.44952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.44955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.44957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.44959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855309.44961: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855309.44963: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855309.44972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855309.44979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855309.44988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855309.45002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855309.45008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855309.45020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855309.45029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855309.45265: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855309.45437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855309.45526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855309.47370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855309.47509: stderr chunk (state=3): >>><<< 30582 1726855309.47513: stdout chunk (state=3): >>><<< 30582 1726855309.47515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855309.47518: handler run complete 30582 1726855309.47520: Evaluated conditional (False): False 30582 1726855309.47522: attempt loop complete, returning result 30582 1726855309.47524: _execute() done 30582 1726855309.47526: dumping result to json 30582 1726855309.47528: done dumping result, returning 30582 1726855309.47530: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-000000000f16] 30582 1726855309.47532: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f16 30582 1726855309.47868: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f16 30582 1726855309.47871: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017678", "end": "2024-09-20 14:01:49.425164", "rc": 0, "start": "2024-09-20 14:01:49.407486" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30582 1726855309.47957: no more pending results, returning what we have 30582 1726855309.47960: results queue empty 30582 1726855309.47961: checking for any_errors_fatal 30582 1726855309.47966: done 
checking for any_errors_fatal 30582 1726855309.47967: checking for max_fail_percentage 30582 1726855309.47975: done checking for max_fail_percentage 30582 1726855309.47977: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.47977: done checking to see if all hosts have failed 30582 1726855309.47978: getting the remaining hosts for this loop 30582 1726855309.47983: done getting the remaining hosts for this loop 30582 1726855309.47988: getting the next task for host managed_node3 30582 1726855309.47996: done getting next task for host managed_node3 30582 1726855309.48113: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855309.48118: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855309.48123: getting variables 30582 1726855309.48124: in VariableManager get_vars() 30582 1726855309.48155: Calling all_inventory to load vars for managed_node3 30582 1726855309.48158: Calling groups_inventory to load vars for managed_node3 30582 1726855309.48161: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.48171: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.48177: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.48181: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.51855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.55180: done with get_vars() 30582 1726855309.55323: done getting variables 30582 1726855309.55384: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 14:01:49 -0400 (0:00:00.443) 0:00:45.904 ****** 30582 1726855309.55428: entering _queue_task() for managed_node3/set_fact 30582 1726855309.56316: worker is 1 (out of 1 available) 30582 1726855309.56330: exiting _queue_task() for managed_node3/set_fact 30582 1726855309.56343: done queuing things up, now waiting for results queue to drain 30582 1726855309.56344: waiting for pending results... 
30582 1726855309.57027: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855309.57150: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f17 30582 1726855309.57450: variable 'ansible_search_path' from source: unknown 30582 1726855309.57453: variable 'ansible_search_path' from source: unknown 30582 1726855309.57457: calling self._execute() 30582 1726855309.57511: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.57522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.57570: variable 'omit' from source: magic vars 30582 1726855309.58398: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.58419: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.58742: variable 'nm_profile_exists' from source: set_fact 30582 1726855309.58768: Evaluated conditional (nm_profile_exists.rc == 0): True 30582 1726855309.58821: variable 'omit' from source: magic vars 30582 1726855309.58948: variable 'omit' from source: magic vars 30582 1726855309.59048: variable 'omit' from source: magic vars 30582 1726855309.59158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855309.59259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855309.59286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855309.59411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855309.59415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855309.59520: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
30582 1726855309.59523: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.59526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.59721: Set connection var ansible_timeout to 10 30582 1726855309.59742: Set connection var ansible_connection to ssh 30582 1726855309.59756: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855309.59802: Set connection var ansible_pipelining to False 30582 1726855309.59814: Set connection var ansible_shell_executable to /bin/sh 30582 1726855309.59821: Set connection var ansible_shell_type to sh 30582 1726855309.59873: variable 'ansible_shell_executable' from source: unknown 30582 1726855309.59908: variable 'ansible_connection' from source: unknown 30582 1726855309.59916: variable 'ansible_module_compression' from source: unknown 30582 1726855309.60010: variable 'ansible_shell_type' from source: unknown 30582 1726855309.60013: variable 'ansible_shell_executable' from source: unknown 30582 1726855309.60015: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.60016: variable 'ansible_pipelining' from source: unknown 30582 1726855309.60018: variable 'ansible_timeout' from source: unknown 30582 1726855309.60020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.60295: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855309.60392: variable 'omit' from source: magic vars 30582 1726855309.60395: starting attempt loop 30582 1726855309.60398: running the handler 30582 1726855309.60407: handler run complete 30582 1726855309.60422: attempt loop complete, returning result 30582 1726855309.60429: _execute() done 
30582 1726855309.60436: dumping result to json 30582 1726855309.60449: done dumping result, returning 30582 1726855309.60553: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-000000000f17] 30582 1726855309.60556: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f17 30582 1726855309.60892: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f17 30582 1726855309.60896: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30582 1726855309.60953: no more pending results, returning what we have 30582 1726855309.60956: results queue empty 30582 1726855309.60957: checking for any_errors_fatal 30582 1726855309.60968: done checking for any_errors_fatal 30582 1726855309.60968: checking for max_fail_percentage 30582 1726855309.60970: done checking for max_fail_percentage 30582 1726855309.60971: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.60972: done checking to see if all hosts have failed 30582 1726855309.60973: getting the remaining hosts for this loop 30582 1726855309.60974: done getting the remaining hosts for this loop 30582 1726855309.60978: getting the next task for host managed_node3 30582 1726855309.60992: done getting next task for host managed_node3 30582 1726855309.60995: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855309.61001: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855309.61008: getting variables 30582 1726855309.61010: in VariableManager get_vars() 30582 1726855309.61045: Calling all_inventory to load vars for managed_node3 30582 1726855309.61048: Calling groups_inventory to load vars for managed_node3 30582 1726855309.61051: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.61063: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.61066: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.61069: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.64027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.67795: done with get_vars() 30582 1726855309.67824: done getting variables 30582 1726855309.68004: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855309.68240: variable 'profile' from source: play vars 30582 
1726855309.68244: variable 'interface' from source: play vars 30582 1726855309.68303: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 14:01:49 -0400 (0:00:00.129) 0:00:46.034 ****** 30582 1726855309.68428: entering _queue_task() for managed_node3/command 30582 1726855309.69230: worker is 1 (out of 1 available) 30582 1726855309.69244: exiting _queue_task() for managed_node3/command 30582 1726855309.69257: done queuing things up, now waiting for results queue to drain 30582 1726855309.69258: waiting for pending results... 30582 1726855309.69819: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30582 1726855309.70024: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f19 30582 1726855309.70048: variable 'ansible_search_path' from source: unknown 30582 1726855309.70131: variable 'ansible_search_path' from source: unknown 30582 1726855309.70349: calling self._execute() 30582 1726855309.70353: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.70568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.70572: variable 'omit' from source: magic vars 30582 1726855309.71193: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.71235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.71431: variable 'profile_stat' from source: set_fact 30582 1726855309.71564: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855309.71572: when evaluation is False, skipping this task 30582 1726855309.71579: _execute() done 30582 1726855309.71768: dumping result to json 30582 1726855309.71771: done dumping result, returning 30582 1726855309.71774: done running 
TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000f19] 30582 1726855309.71776: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f19 30582 1726855309.71855: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f19 30582 1726855309.71858: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855309.71925: no more pending results, returning what we have 30582 1726855309.71929: results queue empty 30582 1726855309.71930: checking for any_errors_fatal 30582 1726855309.71938: done checking for any_errors_fatal 30582 1726855309.71939: checking for max_fail_percentage 30582 1726855309.71942: done checking for max_fail_percentage 30582 1726855309.71943: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.71944: done checking to see if all hosts have failed 30582 1726855309.71944: getting the remaining hosts for this loop 30582 1726855309.71946: done getting the remaining hosts for this loop 30582 1726855309.71950: getting the next task for host managed_node3 30582 1726855309.71960: done getting next task for host managed_node3 30582 1726855309.71962: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855309.71968: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855309.71973: getting variables 30582 1726855309.71975: in VariableManager get_vars() 30582 1726855309.72017: Calling all_inventory to load vars for managed_node3 30582 1726855309.72020: Calling groups_inventory to load vars for managed_node3 30582 1726855309.72026: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.72041: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.72045: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.72048: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.75166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.77080: done with get_vars() 30582 1726855309.77114: done getting variables 30582 1726855309.77179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855309.77298: variable 'profile' from source: play vars 30582 1726855309.77302: variable 'interface' from source: play vars 30582 1726855309.77360: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 14:01:49 -0400 (0:00:00.089) 0:00:46.123 ****** 30582 1726855309.77399: entering _queue_task() for managed_node3/set_fact 30582 1726855309.77772: worker is 1 (out of 1 available) 30582 1726855309.77995: exiting _queue_task() for managed_node3/set_fact 30582 1726855309.78008: done queuing things up, now waiting for results queue to drain 30582 1726855309.78009: waiting for pending results... 30582 1726855309.78216: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30582 1726855309.78239: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f1a 30582 1726855309.78256: variable 'ansible_search_path' from source: unknown 30582 1726855309.78260: variable 'ansible_search_path' from source: unknown 30582 1726855309.78421: calling self._execute() 30582 1726855309.78424: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.78427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.78430: variable 'omit' from source: magic vars 30582 1726855309.78832: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.78855: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.79015: variable 'profile_stat' from source: set_fact 30582 1726855309.79018: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855309.79020: when evaluation is False, skipping this task 30582 1726855309.79022: _execute() done 30582 1726855309.79024: dumping result to json 30582 1726855309.79026: done dumping result, returning 30582 1726855309.79029: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000f1a] 30582 1726855309.79031: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000000f1a 30582 1726855309.79114: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f1a 30582 1726855309.79121: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855309.79240: no more pending results, returning what we have 30582 1726855309.79249: results queue empty 30582 1726855309.79251: checking for any_errors_fatal 30582 1726855309.79258: done checking for any_errors_fatal 30582 1726855309.79259: checking for max_fail_percentage 30582 1726855309.79262: done checking for max_fail_percentage 30582 1726855309.79262: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.79263: done checking to see if all hosts have failed 30582 1726855309.79264: getting the remaining hosts for this loop 30582 1726855309.79265: done getting the remaining hosts for this loop 30582 1726855309.79270: getting the next task for host managed_node3 30582 1726855309.79281: done getting next task for host managed_node3 30582 1726855309.79284: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30582 1726855309.79317: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855309.79326: getting variables 30582 1726855309.79328: in VariableManager get_vars() 30582 1726855309.79367: Calling all_inventory to load vars for managed_node3 30582 1726855309.79370: Calling groups_inventory to load vars for managed_node3 30582 1726855309.79376: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.79396: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.79401: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.79408: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.82283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.84622: done with get_vars() 30582 1726855309.84653: done getting variables 30582 1726855309.84719: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855309.85018: variable 'profile' from source: play vars 30582 1726855309.85023: variable 'interface' from source: play vars 30582 1726855309.85085: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 14:01:49 -0400 (0:00:00.077) 
0:00:46.201 ****** 30582 1726855309.85120: entering _queue_task() for managed_node3/command 30582 1726855309.85567: worker is 1 (out of 1 available) 30582 1726855309.85585: exiting _queue_task() for managed_node3/command 30582 1726855309.85599: done queuing things up, now waiting for results queue to drain 30582 1726855309.85601: waiting for pending results... 30582 1726855309.85952: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30582 1726855309.86231: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f1b 30582 1726855309.86438: variable 'ansible_search_path' from source: unknown 30582 1726855309.86444: variable 'ansible_search_path' from source: unknown 30582 1726855309.86448: calling self._execute() 30582 1726855309.86544: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.86602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.86770: variable 'omit' from source: magic vars 30582 1726855309.87379: variable 'ansible_distribution_major_version' from source: facts 30582 1726855309.87572: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855309.87978: variable 'profile_stat' from source: set_fact 30582 1726855309.88270: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855309.88306: when evaluation is False, skipping this task 30582 1726855309.88339: _execute() done 30582 1726855309.88368: dumping result to json 30582 1726855309.88396: done dumping result, returning 30582 1726855309.88445: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000f1b] 30582 1726855309.88449: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f1b skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855309.88614: no more pending 
results, returning what we have 30582 1726855309.88618: results queue empty 30582 1726855309.88620: checking for any_errors_fatal 30582 1726855309.88629: done checking for any_errors_fatal 30582 1726855309.88630: checking for max_fail_percentage 30582 1726855309.88632: done checking for max_fail_percentage 30582 1726855309.88633: checking to see if all hosts have failed and the running result is not ok 30582 1726855309.88633: done checking to see if all hosts have failed 30582 1726855309.88634: getting the remaining hosts for this loop 30582 1726855309.88635: done getting the remaining hosts for this loop 30582 1726855309.88639: getting the next task for host managed_node3 30582 1726855309.88650: done getting next task for host managed_node3 30582 1726855309.88652: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855309.88657: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30582 1726855309.88662: getting variables 30582 1726855309.88664: in VariableManager get_vars() 30582 1726855309.88711: Calling all_inventory to load vars for managed_node3 30582 1726855309.88715: Calling groups_inventory to load vars for managed_node3 30582 1726855309.88718: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855309.88741: Calling all_plugins_play to load vars for managed_node3 30582 1726855309.88745: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855309.88749: Calling groups_plugins_play to load vars for managed_node3 30582 1726855309.89500: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f1b 30582 1726855309.89505: WORKER PROCESS EXITING 30582 1726855309.91793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855309.95354: done with get_vars() 30582 1726855309.95406: done getting variables 30582 1726855309.95823: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855309.96363: variable 'profile' from source: play vars 30582 1726855309.96368: variable 'interface' from source: play vars 30582 1726855309.96692: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:01:49 -0400 (0:00:00.116) 0:00:46.317 ****** 30582 1726855309.96733: entering _queue_task() for managed_node3/set_fact 30582 1726855309.98054: worker is 1 (out of 1 available) 30582 1726855309.98067: exiting _queue_task() for 
managed_node3/set_fact 30582 1726855309.98082: done queuing things up, now waiting for results queue to drain 30582 1726855309.98084: waiting for pending results... 30582 1726855309.98590: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855309.98995: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f1c 30582 1726855309.99003: variable 'ansible_search_path' from source: unknown 30582 1726855309.99007: variable 'ansible_search_path' from source: unknown 30582 1726855309.99012: calling self._execute() 30582 1726855309.99376: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855309.99380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855309.99383: variable 'omit' from source: magic vars 30582 1726855310.00107: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.00168: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.00241: variable 'profile_stat' from source: set_fact 30582 1726855310.00252: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855310.00256: when evaluation is False, skipping this task 30582 1726855310.00266: _execute() done 30582 1726855310.00280: dumping result to json 30582 1726855310.00282: done dumping result, returning 30582 1726855310.00285: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000000f1c] 30582 1726855310.00289: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f1c 30582 1726855310.00691: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f1c 30582 1726855310.00695: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855310.00771: no more pending results, returning what we have 30582 1726855310.00777: 
results queue empty 30582 1726855310.00779: checking for any_errors_fatal 30582 1726855310.00789: done checking for any_errors_fatal 30582 1726855310.00790: checking for max_fail_percentage 30582 1726855310.00793: done checking for max_fail_percentage 30582 1726855310.00794: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.00794: done checking to see if all hosts have failed 30582 1726855310.00795: getting the remaining hosts for this loop 30582 1726855310.00797: done getting the remaining hosts for this loop 30582 1726855310.00801: getting the next task for host managed_node3 30582 1726855310.00812: done getting next task for host managed_node3 30582 1726855310.00835: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30582 1726855310.00841: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855310.00847: getting variables 30582 1726855310.00849: in VariableManager get_vars() 30582 1726855310.00994: Calling all_inventory to load vars for managed_node3 30582 1726855310.00997: Calling groups_inventory to load vars for managed_node3 30582 1726855310.01000: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.01010: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.01013: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.01015: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.19394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.23940: done with get_vars() 30582 1726855310.23975: done getting variables 30582 1726855310.24039: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855310.24144: variable 'profile' from source: play vars 30582 1726855310.24148: variable 'interface' from source: play vars 30582 1726855310.24212: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 14:01:50 -0400 (0:00:00.275) 0:00:46.592 ****** 30582 1726855310.24240: entering _queue_task() for managed_node3/assert 30582 1726855310.24814: worker is 1 (out of 1 available) 30582 1726855310.24824: exiting _queue_task() for managed_node3/assert 30582 1726855310.24835: done queuing things up, now waiting for results queue to drain 30582 1726855310.24836: waiting for pending results... 
30582 1726855310.25207: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' 30582 1726855310.25213: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e8c 30582 1726855310.25218: variable 'ansible_search_path' from source: unknown 30582 1726855310.25221: variable 'ansible_search_path' from source: unknown 30582 1726855310.25225: calling self._execute() 30582 1726855310.25235: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.25241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.25250: variable 'omit' from source: magic vars 30582 1726855310.25645: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.25657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.25897: variable 'omit' from source: magic vars 30582 1726855310.25901: variable 'omit' from source: magic vars 30582 1726855310.25904: variable 'profile' from source: play vars 30582 1726855310.25908: variable 'interface' from source: play vars 30582 1726855310.25912: variable 'interface' from source: play vars 30582 1726855310.25915: variable 'omit' from source: magic vars 30582 1726855310.26162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855310.26165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855310.26168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855310.26170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.26173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.26175: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855310.26178: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.26180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.26233: Set connection var ansible_timeout to 10 30582 1726855310.26236: Set connection var ansible_connection to ssh 30582 1726855310.26239: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855310.26241: Set connection var ansible_pipelining to False 30582 1726855310.26243: Set connection var ansible_shell_executable to /bin/sh 30582 1726855310.26246: Set connection var ansible_shell_type to sh 30582 1726855310.26294: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.26298: variable 'ansible_connection' from source: unknown 30582 1726855310.26300: variable 'ansible_module_compression' from source: unknown 30582 1726855310.26302: variable 'ansible_shell_type' from source: unknown 30582 1726855310.26304: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.26306: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.26308: variable 'ansible_pipelining' from source: unknown 30582 1726855310.26310: variable 'ansible_timeout' from source: unknown 30582 1726855310.26312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.26436: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855310.26446: variable 'omit' from source: magic vars 30582 1726855310.26452: starting attempt loop 30582 1726855310.26455: running the handler 30582 1726855310.26768: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855310.26771: Evaluated conditional 
(lsr_net_profile_exists): True 30582 1726855310.26773: handler run complete 30582 1726855310.26967: attempt loop complete, returning result 30582 1726855310.26970: _execute() done 30582 1726855310.26973: dumping result to json 30582 1726855310.26975: done dumping result, returning 30582 1726855310.26977: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'statebr' [0affcc66-ac2b-aa83-7d57-000000000e8c] 30582 1726855310.26978: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8c 30582 1726855310.27068: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8c 30582 1726855310.27071: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855310.27131: no more pending results, returning what we have 30582 1726855310.27135: results queue empty 30582 1726855310.27136: checking for any_errors_fatal 30582 1726855310.27144: done checking for any_errors_fatal 30582 1726855310.27145: checking for max_fail_percentage 30582 1726855310.27147: done checking for max_fail_percentage 30582 1726855310.27149: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.27149: done checking to see if all hosts have failed 30582 1726855310.27150: getting the remaining hosts for this loop 30582 1726855310.27151: done getting the remaining hosts for this loop 30582 1726855310.27155: getting the next task for host managed_node3 30582 1726855310.27164: done getting next task for host managed_node3 30582 1726855310.27167: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30582 1726855310.27172: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855310.27180: getting variables 30582 1726855310.27182: in VariableManager get_vars() 30582 1726855310.27326: Calling all_inventory to load vars for managed_node3 30582 1726855310.27329: Calling groups_inventory to load vars for managed_node3 30582 1726855310.27333: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.27346: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.27349: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.27353: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.29418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.31854: done with get_vars() 30582 1726855310.31966: done getting variables 30582 1726855310.32033: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855310.32275: variable 'profile' from source: play vars 30582 1726855310.32280: variable 'interface' from source: play vars 30582 1726855310.32451: variable 'interface' from 
source: play vars

TASK [Assert that the ansible managed comment is present in 'statebr'] *********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024 14:01:50 -0400 (0:00:00.082) 0:00:46.674 ******
30582 1726855310.32496: entering _queue_task() for managed_node3/assert 30582 1726855310.33309: worker is 1 (out of 1 available) 30582 1726855310.33320: exiting _queue_task() for managed_node3/assert 30582 1726855310.33331: done queuing things up, now waiting for results queue to drain 30582 1726855310.33332: waiting for pending results... 30582 1726855310.34109: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' 30582 1726855310.34391: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e8d 30582 1726855310.34396: variable 'ansible_search_path' from source: unknown 30582 1726855310.34399: variable 'ansible_search_path' from source: unknown 30582 1726855310.34402: calling self._execute() 30582 1726855310.34443: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.34450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.34581: variable 'omit' from source: magic vars 30582 1726855310.35336: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.35348: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.35354: variable 'omit' from source: magic vars 30582 1726855310.35409: variable 'omit' from source: magic vars 30582 1726855310.35689: variable 'profile' from source: play vars 30582 1726855310.35769: variable 'interface' from source: play vars 30582 1726855310.35836: variable 'interface' from source: play vars 30582 1726855310.35855: variable 'omit' from source: magic vars 30582 1726855310.36015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855310.36154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855310.36158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855310.36203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.36218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.36246: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855310.36250: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.36252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.36466: Set connection var ansible_timeout to 10 30582 1726855310.36470: Set connection var ansible_connection to ssh 30582 1726855310.36478: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855310.36484: Set connection var ansible_pipelining to False 30582 1726855310.36490: Set connection var ansible_shell_executable to /bin/sh 30582 1726855310.36598: Set connection var ansible_shell_type to sh 30582 1726855310.36620: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.36623: variable 'ansible_connection' from source: unknown 30582 1726855310.36631: variable 'ansible_module_compression' from source: unknown 30582 1726855310.36642: variable 'ansible_shell_type' from source: unknown 30582 1726855310.36644: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.36647: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.36651: variable 'ansible_pipelining' from source: unknown 30582 1726855310.36654: variable 'ansible_timeout' from source: unknown 30582 1726855310.36656: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.37021: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855310.37024: variable 'omit' from source: magic vars 30582 1726855310.37026: starting attempt loop 30582 1726855310.37029: running the handler 30582 1726855310.37246: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30582 1726855310.37251: Evaluated conditional (lsr_net_profile_ansible_managed): True 30582 1726855310.37257: handler run complete 30582 1726855310.37272: attempt loop complete, returning result 30582 1726855310.37274: _execute() done 30582 1726855310.37397: dumping result to json 30582 1726855310.37401: done dumping result, returning 30582 1726855310.37412: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'statebr' [0affcc66-ac2b-aa83-7d57-000000000e8d] 30582 1726855310.37418: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8d 30582 1726855310.37509: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8d 30582 1726855310.37513: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855310.37565: no more pending results, returning what we have 30582 1726855310.37569: results queue empty 30582 1726855310.37570: checking for any_errors_fatal 30582 1726855310.37580: done checking for any_errors_fatal 30582 1726855310.37581: checking for max_fail_percentage 30582 1726855310.37583: done checking for max_fail_percentage 30582 1726855310.37584: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.37585: done checking to see if all hosts have failed 30582 1726855310.37586: 
getting the remaining hosts for this loop 30582 1726855310.37589: done getting the remaining hosts for this loop 30582 1726855310.37593: getting the next task for host managed_node3 30582 1726855310.37602: done getting next task for host managed_node3 30582 1726855310.37606: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30582 1726855310.37611: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855310.37616: getting variables 30582 1726855310.37618: in VariableManager get_vars() 30582 1726855310.37656: Calling all_inventory to load vars for managed_node3 30582 1726855310.37659: Calling groups_inventory to load vars for managed_node3 30582 1726855310.37663: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.37679: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.37683: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.37686: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.41210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.44668: done with get_vars() 30582 1726855310.44818: done getting variables 30582 1726855310.44882: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855310.45206: variable 'profile' from source: play vars 30582 1726855310.45210: variable 'interface' from source: play vars 30582 1726855310.45281: variable 'interface' from source: play vars

TASK [Assert that the fingerprint comment is present in statebr] ***************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 14:01:50 -0400 (0:00:00.128) 0:00:46.803 ******
30582 1726855310.45317: entering _queue_task() for managed_node3/assert 30582 1726855310.46168: worker is 1 (out of 1 available) 30582 1726855310.46184: exiting _queue_task() for managed_node3/assert 30582 1726855310.46500: done queuing things up, now waiting for results queue to drain 30582 1726855310.46502: waiting for pending results...
30582 1726855310.46917: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr 30582 1726855310.47145: in run() - task 0affcc66-ac2b-aa83-7d57-000000000e8e 30582 1726855310.47150: variable 'ansible_search_path' from source: unknown 30582 1726855310.47153: variable 'ansible_search_path' from source: unknown 30582 1726855310.47326: calling self._execute() 30582 1726855310.47515: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.47519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.47523: variable 'omit' from source: magic vars 30582 1726855310.48382: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.48386: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.48391: variable 'omit' from source: magic vars 30582 1726855310.48595: variable 'omit' from source: magic vars 30582 1726855310.48674: variable 'profile' from source: play vars 30582 1726855310.48684: variable 'interface' from source: play vars 30582 1726855310.48746: variable 'interface' from source: play vars 30582 1726855310.48764: variable 'omit' from source: magic vars 30582 1726855310.48923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855310.48958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855310.48981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855310.49106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.49122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.49152: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 30582 1726855310.49155: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.49158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.49392: Set connection var ansible_timeout to 10 30582 1726855310.49498: Set connection var ansible_connection to ssh 30582 1726855310.49507: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855310.49512: Set connection var ansible_pipelining to False 30582 1726855310.49517: Set connection var ansible_shell_executable to /bin/sh 30582 1726855310.49520: Set connection var ansible_shell_type to sh 30582 1726855310.49635: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.49639: variable 'ansible_connection' from source: unknown 30582 1726855310.49641: variable 'ansible_module_compression' from source: unknown 30582 1726855310.49643: variable 'ansible_shell_type' from source: unknown 30582 1726855310.49645: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.49647: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.49648: variable 'ansible_pipelining' from source: unknown 30582 1726855310.49651: variable 'ansible_timeout' from source: unknown 30582 1726855310.49652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.49925: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855310.49937: variable 'omit' from source: magic vars 30582 1726855310.49942: starting attempt loop 30582 1726855310.49945: running the handler 30582 1726855310.50181: variable 'lsr_net_profile_fingerprint' from source: set_fact 30582 1726855310.50221: Evaluated 
conditional (lsr_net_profile_fingerprint): True 30582 1726855310.50306: handler run complete 30582 1726855310.50329: attempt loop complete, returning result 30582 1726855310.50332: _execute() done 30582 1726855310.50335: dumping result to json 30582 1726855310.50337: done dumping result, returning 30582 1726855310.50340: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in statebr [0affcc66-ac2b-aa83-7d57-000000000e8e] 30582 1726855310.50342: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8e ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855310.50720: no more pending results, returning what we have 30582 1726855310.50724: results queue empty 30582 1726855310.50725: checking for any_errors_fatal 30582 1726855310.50734: done checking for any_errors_fatal 30582 1726855310.50735: checking for max_fail_percentage 30582 1726855310.50738: done checking for max_fail_percentage 30582 1726855310.50739: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.50739: done checking to see if all hosts have failed 30582 1726855310.50740: getting the remaining hosts for this loop 30582 1726855310.50742: done getting the remaining hosts for this loop 30582 1726855310.50746: getting the next task for host managed_node3 30582 1726855310.50757: done getting next task for host managed_node3 30582 1726855310.50760: ^ task is: TASK: Conditional asserts 30582 1726855310.50764: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855310.50769: getting variables 30582 1726855310.50771: in VariableManager get_vars() 30582 1726855310.50815: Calling all_inventory to load vars for managed_node3 30582 1726855310.50818: Calling groups_inventory to load vars for managed_node3 30582 1726855310.50822: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.50835: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.50839: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.50844: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.51394: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000e8e 30582 1726855310.51397: WORKER PROCESS EXITING 30582 1726855310.53980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.57764: done with get_vars() 30582 1726855310.57793: done getting variables

TASK [Conditional asserts] *****************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42
Friday 20 September 2024 14:01:50 -0400 (0:00:00.127) 0:00:46.930 ******
30582 1726855310.58054: entering _queue_task() for managed_node3/include_tasks 30582 1726855310.59023: worker is 1 (out of 1 available) 30582 1726855310.59035: exiting _queue_task() for managed_node3/include_tasks 30582 1726855310.59045: done queuing things up, now waiting for results queue to drain 30582 1726855310.59047: waiting for pending results...
30582 1726855310.59370: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855310.59698: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a4f 30582 1726855310.59719: variable 'ansible_search_path' from source: unknown 30582 1726855310.59723: variable 'ansible_search_path' from source: unknown 30582 1726855310.60396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855310.64990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855310.65263: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855310.65311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855310.65345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855310.65371: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855310.65583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855310.65795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855310.65798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855310.65801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30582 1726855310.65806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855310.66155: dumping result to json 30582 1726855310.66159: done dumping result, returning 30582 1726855310.66166: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-000000000a4f] 30582 1726855310.66378: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4f skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30582 1726855310.66524: no more pending results, returning what we have 30582 1726855310.66529: results queue empty 30582 1726855310.66530: checking for any_errors_fatal 30582 1726855310.66538: done checking for any_errors_fatal 30582 1726855310.66539: checking for max_fail_percentage 30582 1726855310.66541: done checking for max_fail_percentage 30582 1726855310.66542: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.66543: done checking to see if all hosts have failed 30582 1726855310.66544: getting the remaining hosts for this loop 30582 1726855310.66545: done getting the remaining hosts for this loop 30582 1726855310.66550: getting the next task for host managed_node3 30582 1726855310.66558: done getting next task for host managed_node3 30582 1726855310.66560: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855310.66563: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855310.66567: getting variables 30582 1726855310.66569: in VariableManager get_vars() 30582 1726855310.66614: Calling all_inventory to load vars for managed_node3 30582 1726855310.66617: Calling groups_inventory to load vars for managed_node3 30582 1726855310.66621: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.66634: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.66638: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.66642: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.67348: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a4f 30582 1726855310.67351: WORKER PROCESS EXITING 30582 1726855310.69922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.73392: done with get_vars() 30582 1726855310.73425: done getting variables 30582 1726855310.73610: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855310.73868: variable 'lsr_description' from source: include params

TASK [Success in test 'I can activate an existing profile'] ********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47
Friday 20 September 2024 14:01:50 -0400 (0:00:00.159) 0:00:47.090 ******
30582 1726855310.74019: entering _queue_task() for managed_node3/debug 30582 1726855310.74902: worker is 1 (out of 1 available) 30582 1726855310.74915: exiting _queue_task() for managed_node3/debug 30582
1726855310.74925: done queuing things up, now waiting for results queue to drain 30582 1726855310.74926: waiting for pending results... 30582 1726855310.75360: running TaskExecutor() for managed_node3/TASK: Success in test 'I can activate an existing profile' 30582 1726855310.75792: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a50 30582 1726855310.75799: variable 'ansible_search_path' from source: unknown 30582 1726855310.75803: variable 'ansible_search_path' from source: unknown 30582 1726855310.75807: calling self._execute() 30582 1726855310.75983: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.75986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.76081: variable 'omit' from source: magic vars 30582 1726855310.76772: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.76789: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.76930: variable 'omit' from source: magic vars 30582 1726855310.76937: variable 'omit' from source: magic vars 30582 1726855310.77126: variable 'lsr_description' from source: include params 30582 1726855310.77364: variable 'omit' from source: magic vars 30582 1726855310.77367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855310.77370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855310.77692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855310.77696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.77698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855310.77701: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 30582 1726855310.77703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.77706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.77793: Set connection var ansible_timeout to 10 30582 1726855310.77796: Set connection var ansible_connection to ssh 30582 1726855310.77798: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855310.77800: Set connection var ansible_pipelining to False 30582 1726855310.77891: Set connection var ansible_shell_executable to /bin/sh 30582 1726855310.77894: Set connection var ansible_shell_type to sh 30582 1726855310.77918: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.77922: variable 'ansible_connection' from source: unknown 30582 1726855310.77924: variable 'ansible_module_compression' from source: unknown 30582 1726855310.77931: variable 'ansible_shell_type' from source: unknown 30582 1726855310.77934: variable 'ansible_shell_executable' from source: unknown 30582 1726855310.77938: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.77942: variable 'ansible_pipelining' from source: unknown 30582 1726855310.77944: variable 'ansible_timeout' from source: unknown 30582 1726855310.77949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.78371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855310.78375: variable 'omit' from source: magic vars 30582 1726855310.78377: starting attempt loop 30582 1726855310.78380: running the handler 30582 1726855310.78433: handler run complete 30582 1726855310.78448: attempt loop complete, returning result 30582 
1726855310.78451: _execute() done 30582 1726855310.78454: dumping result to json 30582 1726855310.78456: done dumping result, returning 30582 1726855310.78464: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can activate an existing profile' [0affcc66-ac2b-aa83-7d57-000000000a50] 30582 1726855310.78474: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a50 30582 1726855310.78569: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a50 30582 1726855310.78573: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 30582 1726855310.78627: no more pending results, returning what we have 30582 1726855310.78632: results queue empty 30582 1726855310.78633: checking for any_errors_fatal 30582 1726855310.78640: done checking for any_errors_fatal 30582 1726855310.78641: checking for max_fail_percentage 30582 1726855310.78643: done checking for max_fail_percentage 30582 1726855310.78644: checking to see if all hosts have failed and the running result is not ok 30582 1726855310.78645: done checking to see if all hosts have failed 30582 1726855310.78645: getting the remaining hosts for this loop 30582 1726855310.78647: done getting the remaining hosts for this loop 30582 1726855310.78651: getting the next task for host managed_node3 30582 1726855310.78660: done getting next task for host managed_node3 30582 1726855310.78665: ^ task is: TASK: Cleanup 30582 1726855310.78668: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855310.78675: getting variables 30582 1726855310.78677: in VariableManager get_vars() 30582 1726855310.78715: Calling all_inventory to load vars for managed_node3 30582 1726855310.78718: Calling groups_inventory to load vars for managed_node3 30582 1726855310.78722: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855310.78733: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.78737: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.78740: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.83251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.86681: done with get_vars() 30582 1726855310.86715: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:01:50 -0400 (0:00:00.129) 0:00:47.219 ****** 30582 1726855310.86952: entering _queue_task() for managed_node3/include_tasks 30582 1726855310.87734: worker is 1 (out of 1 available) 30582 1726855310.87748: exiting _queue_task() for managed_node3/include_tasks 30582 1726855310.87983: done queuing things up, now waiting for results queue to drain 30582 1726855310.87985: waiting for pending results... 
30582 1726855310.88648: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855310.88654: in run() - task 0affcc66-ac2b-aa83-7d57-000000000a54 30582 1726855310.88744: variable 'ansible_search_path' from source: unknown 30582 1726855310.88747: variable 'ansible_search_path' from source: unknown 30582 1726855310.88750: variable 'lsr_cleanup' from source: include params 30582 1726855310.89238: variable 'lsr_cleanup' from source: include params 30582 1726855310.89422: variable 'omit' from source: magic vars 30582 1726855310.89749: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855310.89762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855310.89772: variable 'omit' from source: magic vars 30582 1726855310.90251: variable 'ansible_distribution_major_version' from source: facts 30582 1726855310.90494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855310.90498: variable 'item' from source: unknown 30582 1726855310.90501: variable 'item' from source: unknown 30582 1726855310.90602: variable 'item' from source: unknown 30582 1726855310.90664: variable 'item' from source: unknown 30582 1726855310.90800: dumping result to json 30582 1726855310.90804: done dumping result, returning 30582 1726855310.90808: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-000000000a54] 30582 1726855310.90997: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a54 30582 1726855310.91039: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000a54 30582 1726855310.91043: WORKER PROCESS EXITING 30582 1726855310.91069: no more pending results, returning what we have 30582 1726855310.91076: in VariableManager get_vars() 30582 1726855310.91117: Calling all_inventory to load vars for managed_node3 30582 1726855310.91120: Calling groups_inventory to load vars for managed_node3 30582 1726855310.91124: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855310.91138: Calling all_plugins_play to load vars for managed_node3 30582 1726855310.91141: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855310.91144: Calling groups_plugins_play to load vars for managed_node3 30582 1726855310.94238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855310.97521: done with get_vars() 30582 1726855310.97548: variable 'ansible_search_path' from source: unknown 30582 1726855310.97549: variable 'ansible_search_path' from source: unknown 30582 1726855310.97709: we have included files to process 30582 1726855310.97711: generating all_blocks data 30582 1726855310.97713: done generating all_blocks data 30582 1726855310.97719: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855310.97720: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855310.97722: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855310.98091: done processing included file 30582 1726855310.98093: iterating over new_blocks loaded from include file 30582 1726855310.98095: in VariableManager get_vars() 30582 1726855310.98227: done with get_vars() 30582 1726855310.98229: filtering new block on tags 30582 1726855310.98256: done filtering new block on tags 30582 1726855310.98259: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855310.98265: extending task lists for all hosts with included blocks 
30582 1726855311.00718: done extending task lists 30582 1726855311.00720: done processing included files 30582 1726855311.00720: results queue empty 30582 1726855311.00721: checking for any_errors_fatal 30582 1726855311.00724: done checking for any_errors_fatal 30582 1726855311.00725: checking for max_fail_percentage 30582 1726855311.00726: done checking for max_fail_percentage 30582 1726855311.00727: checking to see if all hosts have failed and the running result is not ok 30582 1726855311.00728: done checking to see if all hosts have failed 30582 1726855311.00729: getting the remaining hosts for this loop 30582 1726855311.00730: done getting the remaining hosts for this loop 30582 1726855311.00732: getting the next task for host managed_node3 30582 1726855311.00737: done getting next task for host managed_node3 30582 1726855311.00739: ^ task is: TASK: Cleanup profile and device 30582 1726855311.00741: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855311.00744: getting variables 30582 1726855311.00745: in VariableManager get_vars() 30582 1726855311.00762: Calling all_inventory to load vars for managed_node3 30582 1726855311.00764: Calling groups_inventory to load vars for managed_node3 30582 1726855311.00767: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855311.00775: Calling all_plugins_play to load vars for managed_node3 30582 1726855311.00777: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855311.00780: Calling groups_plugins_play to load vars for managed_node3 30582 1726855311.02149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855311.04789: done with get_vars() 30582 1726855311.04820: done getting variables 30582 1726855311.04865: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:01:51 -0400 (0:00:00.179) 0:00:47.398 ****** 30582 1726855311.04908: entering _queue_task() for managed_node3/shell 30582 1726855311.05455: worker is 1 (out of 1 available) 30582 1726855311.05466: exiting _queue_task() for managed_node3/shell 30582 1726855311.05480: done queuing things up, now waiting for results queue to drain 30582 1726855311.05482: waiting for pending results... 
30582 1726855311.05691: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855311.05745: in run() - task 0affcc66-ac2b-aa83-7d57-000000000f6d 30582 1726855311.05758: variable 'ansible_search_path' from source: unknown 30582 1726855311.05762: variable 'ansible_search_path' from source: unknown 30582 1726855311.05991: calling self._execute() 30582 1726855311.05996: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.05998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.06001: variable 'omit' from source: magic vars 30582 1726855311.06366: variable 'ansible_distribution_major_version' from source: facts 30582 1726855311.06369: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855311.06372: variable 'omit' from source: magic vars 30582 1726855311.06408: variable 'omit' from source: magic vars 30582 1726855311.06704: variable 'interface' from source: play vars 30582 1726855311.06707: variable 'omit' from source: magic vars 30582 1726855311.06710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855311.06712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855311.06714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855311.06721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.06733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.06771: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855311.06774: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.06781: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.06902: Set connection var ansible_timeout to 10 30582 1726855311.06906: Set connection var ansible_connection to ssh 30582 1726855311.07022: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855311.07025: Set connection var ansible_pipelining to False 30582 1726855311.07027: Set connection var ansible_shell_executable to /bin/sh 30582 1726855311.07030: Set connection var ansible_shell_type to sh 30582 1726855311.07032: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.07034: variable 'ansible_connection' from source: unknown 30582 1726855311.07036: variable 'ansible_module_compression' from source: unknown 30582 1726855311.07038: variable 'ansible_shell_type' from source: unknown 30582 1726855311.07040: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.07041: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.07043: variable 'ansible_pipelining' from source: unknown 30582 1726855311.07045: variable 'ansible_timeout' from source: unknown 30582 1726855311.07048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.07131: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.07140: variable 'omit' from source: magic vars 30582 1726855311.07145: starting attempt loop 30582 1726855311.07148: running the handler 30582 1726855311.07158: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.07180: _low_level_execute_command(): starting 30582 1726855311.07189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855311.08197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855311.08379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855311.08614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855311.10307: stdout chunk (state=3): >>>/root <<< 30582 1726855311.10593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855311.10597: stdout chunk (state=3): >>><<< 30582 1726855311.10600: stderr chunk (state=3): >>><<< 30582 1726855311.10602: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855311.10605: _low_level_execute_command(): starting 30582 1726855311.10608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192 `" && echo ansible-tmp-1726855311.1051238-32803-245076954293192="` echo /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192 `" ) && sleep 0' 30582 1726855311.12310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855311.12412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855311.12439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855311.12617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855311.12826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855311.12889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855311.14799: stdout chunk (state=3): >>>ansible-tmp-1726855311.1051238-32803-245076954293192=/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192 <<< 30582 1726855311.14943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855311.14958: stdout chunk (state=3): >>><<< 30582 1726855311.14971: stderr chunk (state=3): >>><<< 30582 1726855311.15001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855311.1051238-32803-245076954293192=/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855311.15042: variable 'ansible_module_compression' from source: unknown 30582 1726855311.15110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855311.15311: variable 'ansible_facts' from source: unknown 30582 1726855311.15509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py 30582 1726855311.15908: Sending initial data 30582 1726855311.15911: Sent initial data (156 bytes) 30582 1726855311.17081: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855311.17179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855311.17331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855311.17394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855311.18949: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855311.19035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855311.19071: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp6xr3p728 /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py <<< 30582 1726855311.19081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py" <<< 30582 1726855311.19166: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp6xr3p728" to remote "/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py" <<< 30582 1726855311.20708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855311.20942: stderr chunk (state=3): >>><<< 30582 1726855311.20945: stdout chunk (state=3): >>><<< 30582 1726855311.20947: done transferring module to remote 30582 1726855311.20954: _low_level_execute_command(): starting 30582 1726855311.20957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/ /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py && sleep 0' 30582 1726855311.22125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855311.22134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855311.22191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855311.22195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855311.22198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 
30582 1726855311.22352: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855311.22384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855311.22692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855311.24337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855311.24341: stdout chunk (state=3): >>><<< 30582 1726855311.24355: stderr chunk (state=3): >>><<< 30582 1726855311.24377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855311.24380: _low_level_execute_command(): starting 30582 1726855311.24384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/AnsiballZ_command.py && sleep 0' 30582 1726855311.26078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855311.26167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855311.26214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855311.26218: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855311.26405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855311.47212: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (7b764d37-80c8-473a-b5aa-e42b924ac508) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:51.415174", "end": "2024-09-20 14:01:51.469712", "delta": "0:00:00.054538", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855311.49581: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855311.49585: stdout chunk (state=3): >>><<<
30582 1726855311.49607: stderr chunk (state=3): >>><<<
30582 1726855311.49680: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (7b764d37-80c8-473a-b5aa-e42b924ac508) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:01:51.415174", "end": "2024-09-20 14:01:51.469712", "delta": "0:00:00.054538", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 1
Shared connection to 10.31.9.244 closed.
30582 1726855311.49701: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30582 1726855311.49714: _low_level_execute_command(): starting
30582 1726855311.49717: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855311.1051238-32803-245076954293192/ > /dev/null 2>&1 && sleep 0'
30582 1726855311.51507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30582 1726855311.51565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30582 1726855311.51657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30582 1726855311.51853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855311.51966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<<
30582 1726855311.52018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30582 1726855311.52108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855311.52248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855311.54254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855311.54593: stderr chunk (state=3): >>><<<
30582 1726855311.54596: stdout chunk (state=3): >>><<<
30582 1726855311.54599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.244 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30582 1726855311.54601: handler run complete
30582 1726855311.54603: Evaluated conditional (False): False
30582 1726855311.54605: attempt loop complete, returning result
30582 1726855311.54607: _execute() done
30582 1726855311.54609: dumping result to json
30582 1726855311.54611: done dumping result, returning
30582 1726855311.54612: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-000000000f6d]
30582 1726855311.54614: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f6d
fatal: [managed_node3]: FAILED! => {
    "changed": false,
    "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n",
    "delta": "0:00:00.054538",
    "end": "2024-09-20 14:01:51.469712",
    "rc": 1,
    "start": "2024-09-20 14:01:51.415174"
}

STDOUT:

Connection 'statebr' (7b764d37-80c8-473a-b5aa-e42b924ac508) successfully deleted.
STDERR:

Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'
Cannot find device "statebr"

MSG:

non-zero return code
...ignoring
30582 1726855311.54764: no more pending results, returning what we have
30582 1726855311.54768: results queue empty
30582 1726855311.54769: checking for any_errors_fatal
30582 1726855311.54770: done checking for any_errors_fatal
30582 1726855311.54771: checking for max_fail_percentage
30582 1726855311.54773: done checking for max_fail_percentage
30582 1726855311.54774: checking to see if all hosts have failed and the running result is not ok
30582 1726855311.54775: done checking to see if all hosts have failed
30582 1726855311.54776: getting the remaining hosts for this loop
30582 1726855311.54778: done getting the remaining hosts for this loop
30582 1726855311.54781: getting the next task for host managed_node3
30582 1726855311.54858: done getting next task for host managed_node3
30582 1726855311.54861: ^ task is: TASK: Include the task 'run_test.yml'
30582 1726855311.54863: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855311.54867: getting variables
30582 1726855311.54869: in VariableManager get_vars()
30582 1726855311.54906: Calling all_inventory to load vars for managed_node3
30582 1726855311.54908: Calling groups_inventory to load vars for managed_node3
30582 1726855311.54911: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855311.54917: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000f6d
30582 1726855311.54921: WORKER PROCESS EXITING
30582 1726855311.55034: Calling all_plugins_play to load vars for managed_node3
30582 1726855311.55038: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855311.55042: Calling groups_plugins_play to load vars for managed_node3
30582 1726855311.57824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855311.61166: done with get_vars()
30582 1726855311.61191: done getting variables

TASK [Include the task 'run_test.yml'] *****************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83
Friday 20 September 2024 14:01:51 -0400 (0:00:00.563) 0:00:47.962 ******
30582 1726855311.61285: entering _queue_task() for managed_node3/include_tasks
30582 1726855311.61647: worker is 1 (out of 1 available)
30582 1726855311.61660: exiting _queue_task() for managed_node3/include_tasks
30582 1726855311.61671: done queuing things up, now waiting for results queue to drain
30582 1726855311.61673: waiting for pending results...
30582 1726855311.61969: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml'
30582 1726855311.62059: in run() - task 0affcc66-ac2b-aa83-7d57-000000000013
30582 1726855311.62072: variable 'ansible_search_path' from source: unknown
30582 1726855311.62110: calling self._execute()
30582 1726855311.62204: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.62207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.62275: variable 'omit' from source: magic vars
30582 1726855311.62598: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.62607: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.62613: _execute() done
30582 1726855311.62616: dumping result to json
30582 1726855311.62619: done dumping result, returning
30582 1726855311.62627: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-000000000013]
30582 1726855311.62633: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000013
30582 1726855311.62855: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000013
30582 1726855311.62858: WORKER PROCESS EXITING
30582 1726855311.63005: no more pending results, returning what we have
30582 1726855311.63009: in VariableManager get_vars()
30582 1726855311.63041: Calling all_inventory to load vars for managed_node3
30582 1726855311.63044: Calling groups_inventory to load vars for managed_node3
30582 1726855311.63047: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855311.63057: Calling all_plugins_play to load vars for managed_node3
30582 1726855311.63060: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855311.63064: Calling groups_plugins_play to load vars for managed_node3
30582 1726855311.64844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855311.66526: done with get_vars()
30582 1726855311.66550: variable 'ansible_search_path' from source: unknown
30582 1726855311.66565: we have included files to process
30582 1726855311.66566: generating all_blocks data
30582 1726855311.66568: done generating all_blocks data
30582 1726855311.66575: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30582 1726855311.66576: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30582 1726855311.66578: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml
30582 1726855311.66973: in VariableManager get_vars()
30582 1726855311.66993: done with get_vars()
30582 1726855311.67034: in VariableManager get_vars()
30582 1726855311.67053: done with get_vars()
30582 1726855311.67096: in VariableManager get_vars()
30582 1726855311.67115: done with get_vars()
30582 1726855311.67156: in VariableManager get_vars()
30582 1726855311.67173: done with get_vars()
30582 1726855311.67214: in VariableManager get_vars()
30582 1726855311.67230: done with get_vars()
30582 1726855311.67605: in VariableManager get_vars()
30582 1726855311.67623: done with get_vars()
30582 1726855311.67636: done processing included file
30582 1726855311.67638: iterating over new_blocks loaded from include file
30582 1726855311.67639: in VariableManager get_vars()
30582 1726855311.67651: done with get_vars()
30582 1726855311.67652: filtering new block on tags
30582 1726855311.67769: done filtering new block on tags
30582 1726855311.67772: done iterating over new_blocks loaded from include file
included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3
30582 1726855311.67777: extending task lists for all hosts with included blocks
30582 1726855311.67881: done extending task lists
30582 1726855311.67883: done processing included files
30582 1726855311.67884: results queue empty
30582 1726855311.67884: checking for any_errors_fatal
30582 1726855311.67892: done checking for any_errors_fatal
30582 1726855311.67893: checking for max_fail_percentage
30582 1726855311.67894: done checking for max_fail_percentage
30582 1726855311.67895: checking to see if all hosts have failed and the running result is not ok
30582 1726855311.67896: done checking to see if all hosts have failed
30582 1726855311.67896: getting the remaining hosts for this loop
30582 1726855311.67898: done getting the remaining hosts for this loop
30582 1726855311.67901: getting the next task for host managed_node3
30582 1726855311.67904: done getting next task for host managed_node3
30582 1726855311.67906: ^ task is: TASK: TEST: {{ lsr_description }}
30582 1726855311.67909: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855311.67911: getting variables
30582 1726855311.67912: in VariableManager get_vars()
30582 1726855311.67921: Calling all_inventory to load vars for managed_node3
30582 1726855311.67923: Calling groups_inventory to load vars for managed_node3
30582 1726855311.67925: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855311.67931: Calling all_plugins_play to load vars for managed_node3
30582 1726855311.67933: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855311.67936: Calling groups_plugins_play to load vars for managed_node3
30582 1726855311.70534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855311.73708: done with get_vars()
30582 1726855311.73742: done getting variables
30582 1726855311.73794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30582 1726855311.74110: variable 'lsr_description' from source: include params

TASK [TEST: I can remove an existing profile without taking it down] ***********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024 14:01:51 -0400 (0:00:00.128) 0:00:48.091 ******
30582 1726855311.74141: entering _queue_task() for managed_node3/debug
30582 1726855311.74707: worker is 1 (out of 1 available)
30582 1726855311.74717: exiting _queue_task() for managed_node3/debug
30582 1726855311.74729: done queuing things up, now waiting for results queue to drain
30582 1726855311.74731: waiting for pending results...
30582 1726855311.75400: running TaskExecutor() for managed_node3/TASK: TEST: I can remove an existing profile without taking it down
30582 1726855311.75543: in run() - task 0affcc66-ac2b-aa83-7d57-000000001005
30582 1726855311.75610: variable 'ansible_search_path' from source: unknown
30582 1726855311.75618: variable 'ansible_search_path' from source: unknown
30582 1726855311.75722: calling self._execute()
30582 1726855311.75978: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.75994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.76012: variable 'omit' from source: magic vars
30582 1726855311.76914: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.76930: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.77051: variable 'omit' from source: magic vars
30582 1726855311.77063: variable 'omit' from source: magic vars
30582 1726855311.77298: variable 'lsr_description' from source: include params
30582 1726855311.77497: variable 'omit' from source: magic vars
30582 1726855311.77501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855311.77503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855311.77576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855311.77629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.77685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.77760: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855311.77855: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.77858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.78036: Set connection var ansible_timeout to 10
30582 1726855311.78080: Set connection var ansible_connection to ssh
30582 1726855311.78098: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855311.78132: Set connection var ansible_pipelining to False
30582 1726855311.78152: Set connection var ansible_shell_executable to /bin/sh
30582 1726855311.78268: Set connection var ansible_shell_type to sh
30582 1726855311.78270: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.78275: variable 'ansible_connection' from source: unknown
30582 1726855311.78277: variable 'ansible_module_compression' from source: unknown
30582 1726855311.78278: variable 'ansible_shell_type' from source: unknown
30582 1726855311.78280: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.78281: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.78283: variable 'ansible_pipelining' from source: unknown
30582 1726855311.78284: variable 'ansible_timeout' from source: unknown
30582 1726855311.78286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.78619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855311.78637: variable 'omit' from source: magic vars
30582 1726855311.78692: starting attempt loop
30582 1726855311.78778: running the handler
30582 1726855311.78819: handler run complete
30582 1726855311.78838: attempt loop complete, returning result
30582 1726855311.78877: _execute() done
30582 1726855311.78892: dumping result to json
30582 1726855311.78996: done dumping result, returning
30582 1726855311.78999: done running TaskExecutor() for managed_node3/TASK: TEST: I can remove an existing profile without taking it down [0affcc66-ac2b-aa83-7d57-000000001005]
30582 1726855311.79002: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001005
30582 1726855311.79091: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001005
30582 1726855311.79094: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

##########
I can remove an existing profile without taking it down
##########
30582 1726855311.79334: no more pending results, returning what we have
30582 1726855311.79338: results queue empty
30582 1726855311.79339: checking for any_errors_fatal
30582 1726855311.79340: done checking for any_errors_fatal
30582 1726855311.79341: checking for max_fail_percentage
30582 1726855311.79342: done checking for max_fail_percentage
30582 1726855311.79343: checking to see if all hosts have failed and the running result is not ok
30582 1726855311.79344: done checking to see if all hosts have failed
30582 1726855311.79344: getting the remaining hosts for this loop
30582 1726855311.79346: done getting the remaining hosts for this loop
30582 1726855311.79349: getting the next task for host managed_node3
30582 1726855311.79355: done getting next task for host managed_node3
30582 1726855311.79357: ^ task is: TASK: Show item
30582 1726855311.79360: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855311.79363: getting variables
30582 1726855311.79365: in VariableManager get_vars()
30582 1726855311.79396: Calling all_inventory to load vars for managed_node3
30582 1726855311.79399: Calling groups_inventory to load vars for managed_node3
30582 1726855311.79402: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855311.79411: Calling all_plugins_play to load vars for managed_node3
30582 1726855311.79413: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855311.79416: Calling groups_plugins_play to load vars for managed_node3
30582 1726855311.81909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855311.85458: done with get_vars()
30582 1726855311.85517: done getting variables
30582 1726855311.85619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024 14:01:51 -0400 (0:00:00.115) 0:00:48.206 ******
30582 1726855311.85664: entering _queue_task() for managed_node3/debug
30582 1726855311.86105: worker is 1 (out of 1 available)
30582 1726855311.86117: exiting _queue_task() for managed_node3/debug
30582 1726855311.86128: done queuing things up, now waiting for results queue to drain
30582 1726855311.86129: waiting for pending results...
30582 1726855311.86492: running TaskExecutor() for managed_node3/TASK: Show item
30582 1726855311.86535: in run() - task 0affcc66-ac2b-aa83-7d57-000000001006
30582 1726855311.86584: variable 'ansible_search_path' from source: unknown
30582 1726855311.86590: variable 'ansible_search_path' from source: unknown
30582 1726855311.86618: variable 'omit' from source: magic vars
30582 1726855311.86792: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.86800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.86804: variable 'omit' from source: magic vars
30582 1726855311.87163: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.87177: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.87180: variable 'omit' from source: magic vars
30582 1726855311.87239: variable 'omit' from source: magic vars
30582 1726855311.87278: variable 'item' from source: unknown
30582 1726855311.87363: variable 'item' from source: unknown
30582 1726855311.87366: variable 'omit' from source: magic vars
30582 1726855311.87403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855311.87481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855311.87484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855311.87486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.87498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.87528: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855311.87532: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.87535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.88093: Set connection var ansible_timeout to 10
30582 1726855311.88096: Set connection var ansible_connection to ssh
30582 1726855311.88098: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855311.88100: Set connection var ansible_pipelining to False
30582 1726855311.88102: Set connection var ansible_shell_executable to /bin/sh
30582 1726855311.88104: Set connection var ansible_shell_type to sh
30582 1726855311.88107: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.88108: variable 'ansible_connection' from source: unknown
30582 1726855311.88110: variable 'ansible_module_compression' from source: unknown
30582 1726855311.88112: variable 'ansible_shell_type' from source: unknown
30582 1726855311.88114: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.88116: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.88117: variable 'ansible_pipelining' from source: unknown
30582 1726855311.88120: variable 'ansible_timeout' from source: unknown
30582 1726855311.88122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.88125: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855311.88127: variable 'omit' from source: magic vars
30582 1726855311.88129: starting attempt loop
30582 1726855311.88131: running the handler
30582 1726855311.88163: variable 'lsr_description' from source: include params
30582 1726855311.88167: variable 'lsr_description' from source: include params
30582 1726855311.88169: handler run complete
30582 1726855311.88171: attempt loop complete, returning result
30582 1726855311.88175: variable 'item' from source: unknown
30582 1726855311.88177: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can remove an existing profile without taking it down"
}
30582 1726855311.88505: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.88508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.88511: variable 'omit' from source: magic vars
30582 1726855311.88747: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.88753: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.88757: variable 'omit' from source: magic vars
30582 1726855311.88771: variable 'omit' from source: magic vars
30582 1726855311.88835: variable 'item' from source: unknown
30582 1726855311.89022: variable 'item' from source: unknown
30582 1726855311.89038: variable 'omit' from source: magic vars
30582 1726855311.89056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855311.89065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.89071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.89083: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855311.89086: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.89090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.89533: Set connection var ansible_timeout to 10
30582 1726855311.89537: Set connection var ansible_connection to ssh
30582 1726855311.89539: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855311.89541: Set connection var ansible_pipelining to False
30582 1726855311.89543: Set connection var ansible_shell_executable to /bin/sh
30582 1726855311.89545: Set connection var ansible_shell_type to sh
30582 1726855311.89547: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.89549: variable 'ansible_connection' from source: unknown
30582 1726855311.89551: variable 'ansible_module_compression' from source: unknown
30582 1726855311.89552: variable 'ansible_shell_type' from source: unknown
30582 1726855311.89554: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.89556: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.89558: variable 'ansible_pipelining' from source: unknown
30582 1726855311.89560: variable 'ansible_timeout' from source: unknown
30582 1726855311.89562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.89564: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855311.89566: variable 'omit' from source: magic vars
30582 1726855311.89568: starting attempt loop
30582 1726855311.89570: running the handler
30582 1726855311.89678: variable 'lsr_setup' from source: include params
30582 1726855311.89742: variable 'lsr_setup' from source: include params
30582 1726855311.89816: handler run complete
30582 1726855311.89834: attempt loop complete, returning result
30582 1726855311.89850: variable 'item' from source: unknown
30582 1726855311.89930: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/create_bridge_profile.yml",
        "tasks/activate_profile.yml"
    ]
}
30582 1726855311.90027: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.90031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.90034: variable 'omit' from source: magic vars
30582 1726855311.90306: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.90310: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.90312: variable 'omit' from source: magic vars
30582 1726855311.90314: variable 'omit' from source: magic vars
30582 1726855311.90317: variable 'item' from source: unknown
30582 1726855311.90348: variable 'item' from source: unknown
30582 1726855311.90367: variable 'omit' from source: magic vars
30582 1726855311.90388: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855311.90392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.90399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.90414: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855311.90420: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.90423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.90499: Set connection var ansible_timeout to 10
30582 1726855311.90503: Set connection var ansible_connection to ssh
30582 1726855311.90512: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855311.90515: Set connection var ansible_pipelining to False
30582 1726855311.90522: Set connection var ansible_shell_executable to /bin/sh
30582 1726855311.90528: Set connection var ansible_shell_type to sh
30582 1726855311.90545: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.90548: variable 'ansible_connection' from source: unknown
30582 1726855311.90551: variable 'ansible_module_compression' from source: unknown
30582 1726855311.90553: variable 'ansible_shell_type' from source: unknown
30582 1726855311.90555: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.90557: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.90563: variable 'ansible_pipelining' from source: unknown
30582 1726855311.90565: variable 'ansible_timeout' from source: unknown
30582 1726855311.90632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.90669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855311.90678: variable 'omit' from source: magic vars
30582 1726855311.90680: starting attempt loop
30582 1726855311.90683: running the handler
30582 1726855311.90731: variable 'lsr_test' from source: include params
30582 1726855311.90814: variable 'lsr_test' from source: include params
30582 1726855311.90833: handler run complete
30582 1726855311.90848: attempt loop complete, returning result
30582 1726855311.90866: variable 'item' from source: unknown
30582 1726855311.90928: variable 'item' from source: unknown
ok: [managed_node3] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/remove_profile.yml"
    ]
}
30582 1726855311.91105: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.91109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.91112: variable 'omit' from source: magic vars
30582 1726855311.91285: variable 'ansible_distribution_major_version' from source: facts
30582 1726855311.91289: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855311.91292: variable 'omit' from source: magic vars
30582 1726855311.91294: variable 'omit' from source: magic vars
30582 1726855311.91296: variable 'item' from source: unknown
30582 1726855311.91311: variable 'item' from source: unknown
30582 1726855311.91324: variable 'omit' from source: magic vars
30582 1726855311.91345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855311.91352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.91358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855311.91369: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855311.91372: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855311.91377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855311.91457: Set connection var ansible_timeout to 10
30582 1726855311.91461: Set connection var ansible_connection to ssh
30582 1726855311.91502: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855311.91505: Set connection var ansible_pipelining to False
30582 1726855311.91507: Set connection var ansible_shell_executable to /bin/sh
30582 1726855311.91510: Set connection var ansible_shell_type to sh
30582 1726855311.91512: variable 'ansible_shell_executable' from source: unknown
30582 1726855311.91514: variable 'ansible_connection' from source: unknown
30582 1726855311.91516: variable 'ansible_module_compression' from
source: unknown 30582 1726855311.91518: variable 'ansible_shell_type' from source: unknown 30582 1726855311.91520: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.91522: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.91524: variable 'ansible_pipelining' from source: unknown 30582 1726855311.91526: variable 'ansible_timeout' from source: unknown 30582 1726855311.91528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.91719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.91723: variable 'omit' from source: magic vars 30582 1726855311.91733: starting attempt loop 30582 1726855311.91741: running the handler 30582 1726855311.91747: variable 'lsr_assert' from source: include params 30582 1726855311.91749: variable 'lsr_assert' from source: include params 30582 1726855311.91755: handler run complete 30582 1726855311.91760: attempt loop complete, returning result 30582 1726855311.91766: variable 'item' from source: unknown 30582 1726855311.91858: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 30582 1726855311.92201: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.92206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.92209: variable 'omit' from source: magic vars 30582 1726855311.92527: variable 'ansible_distribution_major_version' from source: facts 30582 1726855311.92534: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30582 1726855311.92541: variable 'omit' from source: magic vars 30582 1726855311.92556: variable 'omit' from source: magic vars 30582 1726855311.92630: variable 'item' from source: unknown 30582 1726855311.92719: variable 'item' from source: unknown 30582 1726855311.92730: variable 'omit' from source: magic vars 30582 1726855311.92749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855311.92765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.92772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.92806: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855311.92808: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.92811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.92909: Set connection var ansible_timeout to 10 30582 1726855311.92912: Set connection var ansible_connection to ssh 30582 1726855311.92914: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855311.92916: Set connection var ansible_pipelining to False 30582 1726855311.92918: Set connection var ansible_shell_executable to /bin/sh 30582 1726855311.92920: Set connection var ansible_shell_type to sh 30582 1726855311.92944: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.92953: variable 'ansible_connection' from source: unknown 30582 1726855311.92955: variable 'ansible_module_compression' from source: unknown 30582 1726855311.92958: variable 'ansible_shell_type' from source: unknown 30582 1726855311.92960: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.92962: variable 'ansible_host' from source: host vars 
for 'managed_node3' 30582 1726855311.92967: variable 'ansible_pipelining' from source: unknown 30582 1726855311.92969: variable 'ansible_timeout' from source: unknown 30582 1726855311.92975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.93064: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.93067: variable 'omit' from source: magic vars 30582 1726855311.93070: starting attempt loop 30582 1726855311.93072: running the handler 30582 1726855311.93269: handler run complete 30582 1726855311.93324: attempt loop complete, returning result 30582 1726855311.93329: variable 'item' from source: unknown 30582 1726855311.93351: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30582 1726855311.93504: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.93507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.93511: variable 'omit' from source: magic vars 30582 1726855311.93774: variable 'ansible_distribution_major_version' from source: facts 30582 1726855311.93778: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855311.93781: variable 'omit' from source: magic vars 30582 1726855311.93783: variable 'omit' from source: magic vars 30582 1726855311.93785: variable 'item' from source: unknown 30582 1726855311.93789: variable 'item' from source: unknown 30582 1726855311.93791: variable 'omit' from source: magic vars 30582 1726855311.93793: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855311.93809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.93812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.93814: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855311.93828: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.93909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.93958: Set connection var ansible_timeout to 10 30582 1726855311.93961: Set connection var ansible_connection to ssh 30582 1726855311.93968: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855311.93980: Set connection var ansible_pipelining to False 30582 1726855311.93983: Set connection var ansible_shell_executable to /bin/sh 30582 1726855311.93985: Set connection var ansible_shell_type to sh 30582 1726855311.94091: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.94094: variable 'ansible_connection' from source: unknown 30582 1726855311.94096: variable 'ansible_module_compression' from source: unknown 30582 1726855311.94099: variable 'ansible_shell_type' from source: unknown 30582 1726855311.94101: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.94103: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.94105: variable 'ansible_pipelining' from source: unknown 30582 1726855311.94107: variable 'ansible_timeout' from source: unknown 30582 1726855311.94109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.94111: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.94113: variable 'omit' from source: magic vars 30582 1726855311.94115: starting attempt loop 30582 1726855311.94117: running the handler 30582 1726855311.94127: variable 'lsr_fail_debug' from source: play vars 30582 1726855311.94198: variable 'lsr_fail_debug' from source: play vars 30582 1726855311.94221: handler run complete 30582 1726855311.94233: attempt loop complete, returning result 30582 1726855311.94245: variable 'item' from source: unknown 30582 1726855311.94309: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855311.94494: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.94498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.94500: variable 'omit' from source: magic vars 30582 1726855311.94551: variable 'ansible_distribution_major_version' from source: facts 30582 1726855311.94557: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855311.94561: variable 'omit' from source: magic vars 30582 1726855311.94577: variable 'omit' from source: magic vars 30582 1726855311.94627: variable 'item' from source: unknown 30582 1726855311.94680: variable 'item' from source: unknown 30582 1726855311.94708: variable 'omit' from source: magic vars 30582 1726855311.94735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855311.94739: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.94745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855311.94844: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855311.94847: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.94849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.94952: Set connection var ansible_timeout to 10 30582 1726855311.94955: Set connection var ansible_connection to ssh 30582 1726855311.94957: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855311.94959: Set connection var ansible_pipelining to False 30582 1726855311.94961: Set connection var ansible_shell_executable to /bin/sh 30582 1726855311.94964: Set connection var ansible_shell_type to sh 30582 1726855311.94966: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.94968: variable 'ansible_connection' from source: unknown 30582 1726855311.94970: variable 'ansible_module_compression' from source: unknown 30582 1726855311.94971: variable 'ansible_shell_type' from source: unknown 30582 1726855311.94976: variable 'ansible_shell_executable' from source: unknown 30582 1726855311.94978: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855311.94980: variable 'ansible_pipelining' from source: unknown 30582 1726855311.94982: variable 'ansible_timeout' from source: unknown 30582 1726855311.94984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855311.95123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855311.95126: variable 'omit' from source: magic vars 30582 1726855311.95129: starting attempt loop 30582 1726855311.95131: running the handler 30582 1726855311.95133: variable 'lsr_cleanup' from source: include params 30582 1726855311.95169: variable 'lsr_cleanup' from source: include params 30582 1726855311.95225: handler run complete 30582 1726855311.95405: attempt loop complete, returning result 30582 1726855311.95408: variable 'item' from source: unknown 30582 1726855311.95410: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30582 1726855311.95460: dumping result to json 30582 1726855311.95463: done dumping result, returning 30582 1726855311.95465: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-000000001006] 30582 1726855311.95467: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001006 30582 1726855311.95559: no more pending results, returning what we have 30582 1726855311.95563: results queue empty 30582 1726855311.95564: checking for any_errors_fatal 30582 1726855311.95571: done checking for any_errors_fatal 30582 1726855311.95571: checking for max_fail_percentage 30582 1726855311.95573: done checking for max_fail_percentage 30582 1726855311.95574: checking to see if all hosts have failed and the running result is not ok 30582 1726855311.95575: done checking to see if all hosts have failed 30582 1726855311.95575: getting the remaining hosts for this loop 30582 1726855311.95577: done getting the remaining hosts for this loop 30582 1726855311.95581: getting the next task for host managed_node3 30582 1726855311.95641: done getting next task for host managed_node3 30582 
1726855311.95645: ^ task is: TASK: Include the task 'show_interfaces.yml' 30582 1726855311.95648: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855311.95653: getting variables 30582 1726855311.95655: in VariableManager get_vars() 30582 1726855311.95691: Calling all_inventory to load vars for managed_node3 30582 1726855311.95698: Calling groups_inventory to load vars for managed_node3 30582 1726855311.95702: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855311.95720: Calling all_plugins_play to load vars for managed_node3 30582 1726855311.95725: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855311.95729: Calling groups_plugins_play to load vars for managed_node3 30582 1726855311.96617: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001006 30582 1726855311.96620: WORKER PROCESS EXITING 30582 1726855311.99206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.00800: done with get_vars() 30582 1726855312.00872: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 14:01:52 -0400 (0:00:00.153) 0:00:48.360 ****** 30582 1726855312.01056: entering _queue_task() for managed_node3/include_tasks 30582 
1726855312.01601: worker is 1 (out of 1 available) 30582 1726855312.01613: exiting _queue_task() for managed_node3/include_tasks 30582 1726855312.01627: done queuing things up, now waiting for results queue to drain 30582 1726855312.01629: waiting for pending results... 30582 1726855312.02608: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855312.02997: in run() - task 0affcc66-ac2b-aa83-7d57-000000001007 30582 1726855312.03002: variable 'ansible_search_path' from source: unknown 30582 1726855312.03004: variable 'ansible_search_path' from source: unknown 30582 1726855312.03039: calling self._execute() 30582 1726855312.03280: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.03328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.03344: variable 'omit' from source: magic vars 30582 1726855312.04214: variable 'ansible_distribution_major_version' from source: facts 30582 1726855312.04237: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855312.04248: _execute() done 30582 1726855312.04255: dumping result to json 30582 1726855312.04264: done dumping result, returning 30582 1726855312.04275: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-000000001007] 30582 1726855312.04309: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001007 30582 1726855312.04531: no more pending results, returning what we have 30582 1726855312.04536: in VariableManager get_vars() 30582 1726855312.04582: Calling all_inventory to load vars for managed_node3 30582 1726855312.04584: Calling groups_inventory to load vars for managed_node3 30582 1726855312.04589: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.04604: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.04608: Calling groups_plugins_inventory to load 
vars for managed_node3 30582 1726855312.04611: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.05806: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001007 30582 1726855312.05810: WORKER PROCESS EXITING 30582 1726855312.08166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.11862: done with get_vars() 30582 1726855312.11926: variable 'ansible_search_path' from source: unknown 30582 1726855312.11927: variable 'ansible_search_path' from source: unknown 30582 1726855312.11969: we have included files to process 30582 1726855312.11971: generating all_blocks data 30582 1726855312.11975: done generating all_blocks data 30582 1726855312.11980: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855312.11982: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855312.12096: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855312.12280: in VariableManager get_vars() 30582 1726855312.12345: done with get_vars() 30582 1726855312.12642: done processing included file 30582 1726855312.12645: iterating over new_blocks loaded from include file 30582 1726855312.12646: in VariableManager get_vars() 30582 1726855312.12663: done with get_vars() 30582 1726855312.12665: filtering new block on tags 30582 1726855312.12764: done filtering new block on tags 30582 1726855312.12767: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30582 1726855312.12775: extending task lists for all hosts with included blocks 30582 1726855312.13782: 
done extending task lists 30582 1726855312.13784: done processing included files 30582 1726855312.13785: results queue empty 30582 1726855312.13786: checking for any_errors_fatal 30582 1726855312.13793: done checking for any_errors_fatal 30582 1726855312.13794: checking for max_fail_percentage 30582 1726855312.13795: done checking for max_fail_percentage 30582 1726855312.13796: checking to see if all hosts have failed and the running result is not ok 30582 1726855312.13797: done checking to see if all hosts have failed 30582 1726855312.13797: getting the remaining hosts for this loop 30582 1726855312.13799: done getting the remaining hosts for this loop 30582 1726855312.13801: getting the next task for host managed_node3 30582 1726855312.13806: done getting next task for host managed_node3 30582 1726855312.13808: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855312.13811: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855312.13814: getting variables 30582 1726855312.13815: in VariableManager get_vars() 30582 1726855312.13826: Calling all_inventory to load vars for managed_node3 30582 1726855312.13828: Calling groups_inventory to load vars for managed_node3 30582 1726855312.13946: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.13952: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.13955: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.13958: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.16485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.18872: done with get_vars() 30582 1726855312.18901: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 14:01:52 -0400 (0:00:00.179) 0:00:48.539 ****** 30582 1726855312.18991: entering _queue_task() for managed_node3/include_tasks 30582 1726855312.19383: worker is 1 (out of 1 available) 30582 1726855312.19401: exiting _queue_task() for managed_node3/include_tasks 30582 1726855312.19415: done queuing things up, now waiting for results queue to drain 30582 1726855312.19417: waiting for pending results... 
30582 1726855312.19677: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855312.19777: in run() - task 0affcc66-ac2b-aa83-7d57-00000000102e 30582 1726855312.19799: variable 'ansible_search_path' from source: unknown 30582 1726855312.19804: variable 'ansible_search_path' from source: unknown 30582 1726855312.19837: calling self._execute() 30582 1726855312.19994: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.19997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.20000: variable 'omit' from source: magic vars 30582 1726855312.20322: variable 'ansible_distribution_major_version' from source: facts 30582 1726855312.20334: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855312.20345: _execute() done 30582 1726855312.20351: dumping result to json 30582 1726855312.20355: done dumping result, returning 30582 1726855312.20357: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-00000000102e] 30582 1726855312.20360: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000102e 30582 1726855312.20693: no more pending results, returning what we have 30582 1726855312.20698: in VariableManager get_vars() 30582 1726855312.20729: Calling all_inventory to load vars for managed_node3 30582 1726855312.20732: Calling groups_inventory to load vars for managed_node3 30582 1726855312.20735: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.20745: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.20748: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.20751: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.21521: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000102e 30582 1726855312.21524: WORKER PROCESS EXITING 30582 
1726855312.22795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.24763: done with get_vars() 30582 1726855312.24837: variable 'ansible_search_path' from source: unknown 30582 1726855312.24839: variable 'ansible_search_path' from source: unknown 30582 1726855312.24909: we have included files to process 30582 1726855312.24911: generating all_blocks data 30582 1726855312.24912: done generating all_blocks data 30582 1726855312.24913: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855312.24915: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855312.24917: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855312.25516: done processing included file 30582 1726855312.25518: iterating over new_blocks loaded from include file 30582 1726855312.25520: in VariableManager get_vars() 30582 1726855312.25537: done with get_vars() 30582 1726855312.25539: filtering new block on tags 30582 1726855312.25580: done filtering new block on tags 30582 1726855312.25583: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855312.25590: extending task lists for all hosts with included blocks 30582 1726855312.26014: done extending task lists 30582 1726855312.26015: done processing included files 30582 1726855312.26016: results queue empty 30582 1726855312.26017: checking for any_errors_fatal 30582 1726855312.26020: done checking for any_errors_fatal 30582 1726855312.26021: checking for max_fail_percentage 30582 1726855312.26022: done 
checking for max_fail_percentage 30582 1726855312.26022: checking to see if all hosts have failed and the running result is not ok 30582 1726855312.26023: done checking to see if all hosts have failed 30582 1726855312.26024: getting the remaining hosts for this loop 30582 1726855312.26025: done getting the remaining hosts for this loop 30582 1726855312.26054: getting the next task for host managed_node3 30582 1726855312.26061: done getting next task for host managed_node3 30582 1726855312.26064: ^ task is: TASK: Gather current interface info 30582 1726855312.26067: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855312.26070: getting variables 30582 1726855312.26071: in VariableManager get_vars() 30582 1726855312.26090: Calling all_inventory to load vars for managed_node3 30582 1726855312.26093: Calling groups_inventory to load vars for managed_node3 30582 1726855312.26096: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.26102: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.26104: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.26107: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.27496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.29171: done with get_vars() 30582 1726855312.29204: done getting variables 30582 1726855312.29251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:01:52 -0400 (0:00:00.103) 0:00:48.642 ****** 30582 1726855312.29296: entering _queue_task() for managed_node3/command 30582 1726855312.29840: worker is 1 (out of 1 available) 30582 1726855312.29856: exiting _queue_task() for managed_node3/command 30582 1726855312.29869: done queuing things up, now waiting for results queue to drain 30582 1726855312.29871: waiting for pending results... 
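
[Editor's note] The include chain traced in the log above (run_test.yml:21 → show_interfaces.yml:3 → get_current_interfaces.yml:3) corresponds to nested `include_tasks` calls. As a reading aid, the nesting looks roughly like the sketch below; the task names and file paths are taken directly from the log, but the exact task options are assumptions, not the actual test source:

```yaml
# Sketch of the include nesting seen in this log (assumed shape, paths from the log).
# In run_test.yml, the task at line 21:
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml

# In show_interfaces.yml, the task at line 3:
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml

# get_current_interfaces.yml then defines the "Gather current interface info"
# command task whose execution begins in the log records that follow.
```

Each include shows up in the log as a "we have included files to process / generating all_blocks data / extending task lists" cycle before the included task is queued.
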
30582 1726855312.30178: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855312.30394: in run() - task 0affcc66-ac2b-aa83-7d57-000000001069 30582 1726855312.30398: variable 'ansible_search_path' from source: unknown 30582 1726855312.30400: variable 'ansible_search_path' from source: unknown 30582 1726855312.30404: calling self._execute() 30582 1726855312.30447: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.30451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.30467: variable 'omit' from source: magic vars 30582 1726855312.30857: variable 'ansible_distribution_major_version' from source: facts 30582 1726855312.30862: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855312.31085: variable 'omit' from source: magic vars 30582 1726855312.31091: variable 'omit' from source: magic vars 30582 1726855312.31094: variable 'omit' from source: magic vars 30582 1726855312.31097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855312.31100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855312.31102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855312.31105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.31107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.31161: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855312.31164: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.31167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855312.31458: Set connection var ansible_timeout to 10 30582 1726855312.31462: Set connection var ansible_connection to ssh 30582 1726855312.31464: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855312.31466: Set connection var ansible_pipelining to False 30582 1726855312.31468: Set connection var ansible_shell_executable to /bin/sh 30582 1726855312.31470: Set connection var ansible_shell_type to sh 30582 1726855312.31472: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.31477: variable 'ansible_connection' from source: unknown 30582 1726855312.31479: variable 'ansible_module_compression' from source: unknown 30582 1726855312.31481: variable 'ansible_shell_type' from source: unknown 30582 1726855312.31482: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.31484: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.31485: variable 'ansible_pipelining' from source: unknown 30582 1726855312.31490: variable 'ansible_timeout' from source: unknown 30582 1726855312.31492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.31495: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855312.31597: variable 'omit' from source: magic vars 30582 1726855312.31603: starting attempt loop 30582 1726855312.31605: running the handler 30582 1726855312.31631: _low_level_execute_command(): starting 30582 1726855312.31639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855312.32958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855312.33009: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855312.33041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855312.33102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.33183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855312.33216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.33519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855312.35039: stdout chunk (state=3): >>>/root <<< 30582 1726855312.35221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855312.35244: stdout chunk (state=3): >>><<< 30582 1726855312.35247: stderr chunk (state=3): >>><<< 30582 1726855312.35277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855312.35328: _low_level_execute_command(): starting 30582 1726855312.35332: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418 `" && echo ansible-tmp-1726855312.352899-32874-55387370311418="` echo /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418 `" ) && sleep 0' 30582 1726855312.36029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855312.36060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855312.36084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855312.36171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.36223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855312.36239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855312.36261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.36551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855312.38459: stdout chunk (state=3): >>>ansible-tmp-1726855312.352899-32874-55387370311418=/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418 <<< 30582 1726855312.38721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855312.38725: stdout chunk (state=3): >>><<< 30582 1726855312.38727: stderr chunk (state=3): >>><<< 30582 1726855312.39027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855312.352899-32874-55387370311418=/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855312.39030: variable 'ansible_module_compression' from source: unknown 30582 1726855312.39033: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855312.39035: variable 'ansible_facts' from source: unknown 30582 1726855312.39197: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py 30582 1726855312.39381: Sending initial data 30582 1726855312.39481: Sent initial data (154 bytes) 30582 1726855312.40154: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855312.40169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855312.40244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.40434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.40509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855312.42146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855312.42297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855312.42402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_flbyn29 /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py <<< 30582 1726855312.42406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py" <<< 30582 1726855312.42585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_flbyn29" to remote "/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py" <<< 30582 1726855312.44514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855312.44656: stderr chunk (state=3): >>><<< 30582 1726855312.44668: stdout chunk (state=3): >>><<< 30582 1726855312.45101: done transferring module to remote 30582 1726855312.45104: _low_level_execute_command(): starting 30582 1726855312.45107: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/ /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py && sleep 0' 30582 1726855312.46515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.46866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855312.47169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.47517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855312.49192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855312.49196: stdout chunk (state=3): >>><<< 30582 1726855312.49205: stderr chunk (state=3): >>><<< 30582 1726855312.49224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855312.49237: _low_level_execute_command(): starting 30582 1726855312.49248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/AnsiballZ_command.py && sleep 0' 30582 1726855312.51238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855312.51267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855312.51286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855312.51311: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855312.51326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.51402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855312.51444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.51579: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30582 1726855312.67255: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:52.668024", "end": "2024-09-20 14:01:52.671432", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855312.68879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855312.68884: stdout chunk (state=3): >>><<< 30582 1726855312.68886: stderr chunk (state=3): >>><<< 30582 1726855312.69017: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:01:52.668024", "end": "2024-09-20 14:01:52.671432", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855312.69202: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855312.69207: _low_level_execute_command(): starting 30582 1726855312.69209: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855312.352899-32874-55387370311418/ > /dev/null 2>&1 && sleep 0' 30582 1726855312.69802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855312.69893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855312.69896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855312.69994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855312.70118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855312.72096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855312.72100: stdout chunk (state=3): >>><<< 30582 1726855312.72103: stderr chunk (state=3): >>><<< 30582 1726855312.72105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855312.72112: handler run complete 30582 1726855312.72230: Evaluated conditional (False): False 30582 1726855312.72236: attempt loop complete, returning result 30582 1726855312.72238: _execute() done 30582 1726855312.72240: dumping result to json 30582 1726855312.72242: done dumping result, returning 30582 1726855312.72244: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-000000001069] 30582 1726855312.72246: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001069 30582 1726855312.72644: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001069 30582 1726855312.72648: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003408", "end": "2024-09-20 14:01:52.671432", "rc": 0, "start": "2024-09-20 14:01:52.668024" } STDOUT: bonding_masters eth0 lo rpltstbr 30582 1726855312.72733: no more pending results, returning what we have 30582 1726855312.72736: results queue empty 30582 1726855312.72737: checking for any_errors_fatal 30582 1726855312.72739: done checking for any_errors_fatal 30582 1726855312.72739: checking for max_fail_percentage 30582 1726855312.72741: done checking for max_fail_percentage 30582 1726855312.72742: checking to see if all hosts have failed and the running result is not ok 30582 1726855312.72743: done checking to see if all hosts have failed 30582 
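The task result above shows the `command` module running `ls -1` with `chdir: /sys/class/net` and returning `stdout: "bonding_masters\neth0\nlo\nrpltstbr"`. As a minimal sketch (not part of the log) of how Ansible's `stdout` maps to the `stdout_lines` list that the following `set_fact` task consumes, using the exact payload shown in the trace:

```python
import json

# Abridged copy of the module result JSON from the log above,
# keeping only the fields used here.
result = json.loads(
    '{"changed": true, "rc": 0, "cmd": ["ls", "-1"], '
    '"stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr"}'
)

# Ansible derives stdout_lines by splitting stdout on newlines;
# this list is what later becomes the current_interfaces fact.
stdout_lines = result["stdout"].splitlines()
print(stdout_lines)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```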
1726855312.72743: getting the remaining hosts for this loop 30582 1726855312.72744: done getting the remaining hosts for this loop 30582 1726855312.72748: getting the next task for host managed_node3 30582 1726855312.72754: done getting next task for host managed_node3 30582 1726855312.72765: ^ task is: TASK: Set current_interfaces 30582 1726855312.72770: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855312.72777: getting variables 30582 1726855312.72778: in VariableManager get_vars() 30582 1726855312.72810: Calling all_inventory to load vars for managed_node3 30582 1726855312.72813: Calling groups_inventory to load vars for managed_node3 30582 1726855312.72816: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.72826: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.72829: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.73028: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.76077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.79046: done with get_vars() 30582 1726855312.79080: done getting variables 30582 1726855312.79145: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 14:01:52 -0400 (0:00:00.498) 0:00:49.141 ****** 30582 1726855312.79179: entering _queue_task() for managed_node3/set_fact 30582 1726855312.79559: worker is 1 (out of 1 available) 30582 1726855312.79573: exiting _queue_task() for managed_node3/set_fact 30582 1726855312.79585: done queuing things up, now waiting for results queue to drain 30582 1726855312.79586: waiting for pending results... 
30582 1726855312.79891: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855312.80094: in run() - task 0affcc66-ac2b-aa83-7d57-00000000106a 30582 1726855312.80099: variable 'ansible_search_path' from source: unknown 30582 1726855312.80101: variable 'ansible_search_path' from source: unknown 30582 1726855312.80104: calling self._execute() 30582 1726855312.80154: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.80160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.80171: variable 'omit' from source: magic vars 30582 1726855312.80764: variable 'ansible_distribution_major_version' from source: facts 30582 1726855312.80768: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855312.80771: variable 'omit' from source: magic vars 30582 1726855312.80777: variable 'omit' from source: magic vars 30582 1726855312.80780: variable '_current_interfaces' from source: set_fact 30582 1726855312.80816: variable 'omit' from source: magic vars 30582 1726855312.80854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855312.80889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855312.80915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855312.80931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.80943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.80983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855312.80986: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.80991: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.81094: Set connection var ansible_timeout to 10 30582 1726855312.81097: Set connection var ansible_connection to ssh 30582 1726855312.81099: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855312.81102: Set connection var ansible_pipelining to False 30582 1726855312.81106: Set connection var ansible_shell_executable to /bin/sh 30582 1726855312.81113: Set connection var ansible_shell_type to sh 30582 1726855312.81198: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.81202: variable 'ansible_connection' from source: unknown 30582 1726855312.81204: variable 'ansible_module_compression' from source: unknown 30582 1726855312.81207: variable 'ansible_shell_type' from source: unknown 30582 1726855312.81209: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.81210: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.81212: variable 'ansible_pipelining' from source: unknown 30582 1726855312.81214: variable 'ansible_timeout' from source: unknown 30582 1726855312.81216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.81366: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855312.81369: variable 'omit' from source: magic vars 30582 1726855312.81372: starting attempt loop 30582 1726855312.81377: running the handler 30582 1726855312.81379: handler run complete 30582 1726855312.81381: attempt loop complete, returning result 30582 1726855312.81383: _execute() done 30582 1726855312.81385: dumping result to json 30582 1726855312.81389: done dumping result, returning 30582 
1726855312.81392: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-00000000106a] 30582 1726855312.81394: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000106a ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30582 1726855312.81811: no more pending results, returning what we have 30582 1726855312.81814: results queue empty 30582 1726855312.81815: checking for any_errors_fatal 30582 1726855312.81822: done checking for any_errors_fatal 30582 1726855312.81823: checking for max_fail_percentage 30582 1726855312.81825: done checking for max_fail_percentage 30582 1726855312.81826: checking to see if all hosts have failed and the running result is not ok 30582 1726855312.81827: done checking to see if all hosts have failed 30582 1726855312.81828: getting the remaining hosts for this loop 30582 1726855312.81829: done getting the remaining hosts for this loop 30582 1726855312.81832: getting the next task for host managed_node3 30582 1726855312.81841: done getting next task for host managed_node3 30582 1726855312.81843: ^ task is: TASK: Show current_interfaces 30582 1726855312.81847: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855312.81851: getting variables 30582 1726855312.81852: in VariableManager get_vars() 30582 1726855312.81884: Calling all_inventory to load vars for managed_node3 30582 1726855312.81886: Calling groups_inventory to load vars for managed_node3 30582 1726855312.81891: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.81900: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.81902: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.81905: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.82732: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000106a 30582 1726855312.82737: WORKER PROCESS EXITING 30582 1726855312.84928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.92229: done with get_vars() 30582 1726855312.92254: done getting variables 30582 1726855312.92297: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:01:52 -0400 (0:00:00.131) 0:00:49.273 ****** 30582 1726855312.92331: entering _queue_task() for managed_node3/debug 30582 1726855312.92720: worker is 1 (out of 1 available) 30582 1726855312.92735: exiting _queue_task() for managed_node3/debug 30582 1726855312.92746: done queuing things up, now waiting for results queue to drain 30582 1726855312.92748: waiting for pending 
results... 30582 1726855312.93095: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855312.93221: in run() - task 0affcc66-ac2b-aa83-7d57-00000000102f 30582 1726855312.93238: variable 'ansible_search_path' from source: unknown 30582 1726855312.93242: variable 'ansible_search_path' from source: unknown 30582 1726855312.93284: calling self._execute() 30582 1726855312.93392: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.93397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.93407: variable 'omit' from source: magic vars 30582 1726855312.93805: variable 'ansible_distribution_major_version' from source: facts 30582 1726855312.93817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855312.93823: variable 'omit' from source: magic vars 30582 1726855312.93922: variable 'omit' from source: magic vars 30582 1726855312.94040: variable 'current_interfaces' from source: set_fact 30582 1726855312.94071: variable 'omit' from source: magic vars 30582 1726855312.94119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855312.94155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855312.94213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855312.94217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.94222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855312.94292: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855312.94296: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.94299: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.94369: Set connection var ansible_timeout to 10 30582 1726855312.94372: Set connection var ansible_connection to ssh 30582 1726855312.94385: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855312.94392: Set connection var ansible_pipelining to False 30582 1726855312.94397: Set connection var ansible_shell_executable to /bin/sh 30582 1726855312.94400: Set connection var ansible_shell_type to sh 30582 1726855312.94449: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.94453: variable 'ansible_connection' from source: unknown 30582 1726855312.94456: variable 'ansible_module_compression' from source: unknown 30582 1726855312.94459: variable 'ansible_shell_type' from source: unknown 30582 1726855312.94461: variable 'ansible_shell_executable' from source: unknown 30582 1726855312.94464: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855312.94466: variable 'ansible_pipelining' from source: unknown 30582 1726855312.94468: variable 'ansible_timeout' from source: unknown 30582 1726855312.94470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855312.94859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855312.94862: variable 'omit' from source: magic vars 30582 1726855312.94864: starting attempt loop 30582 1726855312.94866: running the handler 30582 1726855312.94867: handler run complete 30582 1726855312.94869: attempt loop complete, returning result 30582 1726855312.94871: _execute() done 30582 1726855312.94873: dumping result to json 30582 1726855312.94874: done dumping result, returning 30582 1726855312.94877: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-00000000102f] 30582 1726855312.94878: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000102f 30582 1726855312.94941: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000102f 30582 1726855312.94944: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855312.95062: no more pending results, returning what we have 30582 1726855312.95065: results queue empty 30582 1726855312.95067: checking for any_errors_fatal 30582 1726855312.95079: done checking for any_errors_fatal 30582 1726855312.95080: checking for max_fail_percentage 30582 1726855312.95082: done checking for max_fail_percentage 30582 1726855312.95083: checking to see if all hosts have failed and the running result is not ok 30582 1726855312.95084: done checking to see if all hosts have failed 30582 1726855312.95085: getting the remaining hosts for this loop 30582 1726855312.95086: done getting the remaining hosts for this loop 30582 1726855312.95093: getting the next task for host managed_node3 30582 1726855312.95103: done getting next task for host managed_node3 30582 1726855312.95107: ^ task is: TASK: Setup 30582 1726855312.95110: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855312.95114: getting variables 30582 1726855312.95116: in VariableManager get_vars() 30582 1726855312.95155: Calling all_inventory to load vars for managed_node3 30582 1726855312.95158: Calling groups_inventory to load vars for managed_node3 30582 1726855312.95161: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855312.95177: Calling all_plugins_play to load vars for managed_node3 30582 1726855312.95181: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855312.95185: Calling groups_plugins_play to load vars for managed_node3 30582 1726855312.96717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855312.98448: done with get_vars() 30582 1726855312.98484: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:01:52 -0400 (0:00:00.064) 0:00:49.338 ****** 30582 1726855312.98817: entering _queue_task() for managed_node3/include_tasks 30582 1726855312.99803: worker is 1 (out of 1 available) 30582 1726855312.99818: exiting _queue_task() for managed_node3/include_tasks 30582 1726855312.99831: done queuing things up, now waiting for results queue to drain 30582 1726855312.99833: waiting for pending results... 
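The `debug` task that just completed printed `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']`. A sketch of a task that would produce that message; the actual source at `tasks/show_interfaces.yml:5` is not included in this log:

```yaml
# Hypothetical reconstruction of tasks/show_interfaces.yml:5
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```
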
30582 1726855313.00577: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855313.00599: in run() - task 0affcc66-ac2b-aa83-7d57-000000001008 30582 1726855313.00766: variable 'ansible_search_path' from source: unknown 30582 1726855313.00770: variable 'ansible_search_path' from source: unknown 30582 1726855313.00773: variable 'lsr_setup' from source: include params 30582 1726855313.01095: variable 'lsr_setup' from source: include params 30582 1726855313.01098: variable 'omit' from source: magic vars 30582 1726855313.01113: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.01121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.01131: variable 'omit' from source: magic vars 30582 1726855313.01372: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.01380: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.01389: variable 'item' from source: unknown 30582 1726855313.01456: variable 'item' from source: unknown 30582 1726855313.01491: variable 'item' from source: unknown 30582 1726855313.01550: variable 'item' from source: unknown 30582 1726855313.01673: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.01677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.01681: variable 'omit' from source: magic vars 30582 1726855313.02002: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.02065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.02069: variable 'item' from source: unknown 30582 1726855313.02071: variable 'item' from source: unknown 30582 1726855313.02073: variable 'item' from source: unknown 30582 1726855313.02075: variable 'item' from source: unknown 30582 1726855313.02238: dumping result to json 30582 1726855313.02241: done dumping result, returning 30582 
1726855313.02244: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-000000001008] 30582 1726855313.02247: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001008 30582 1726855313.02389: no more pending results, returning what we have 30582 1726855313.02394: in VariableManager get_vars() 30582 1726855313.02427: Calling all_inventory to load vars for managed_node3 30582 1726855313.02430: Calling groups_inventory to load vars for managed_node3 30582 1726855313.02432: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.02442: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.02445: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.02448: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.03429: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001008 30582 1726855313.03434: WORKER PROCESS EXITING 30582 1726855313.06244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.09393: done with get_vars() 30582 1726855313.09424: variable 'ansible_search_path' from source: unknown 30582 1726855313.09425: variable 'ansible_search_path' from source: unknown 30582 1726855313.09467: variable 'ansible_search_path' from source: unknown 30582 1726855313.09469: variable 'ansible_search_path' from source: unknown 30582 1726855313.09503: we have included files to process 30582 1726855313.09505: generating all_blocks data 30582 1726855313.09506: done generating all_blocks data 30582 1726855313.09510: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855313.09511: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855313.09513: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855313.09777: done processing included file 30582 1726855313.09780: iterating over new_blocks loaded from include file 30582 1726855313.09781: in VariableManager get_vars() 30582 1726855313.09799: done with get_vars() 30582 1726855313.09801: filtering new block on tags 30582 1726855313.09836: done filtering new block on tags 30582 1726855313.09838: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30582 1726855313.09843: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855313.09844: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855313.09847: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855313.09945: done processing included file 30582 1726855313.09947: iterating over new_blocks loaded from include file 30582 1726855313.09948: in VariableManager get_vars() 30582 1726855313.09967: done with get_vars() 30582 1726855313.09969: filtering new block on tags 30582 1726855313.09994: done filtering new block on tags 30582 1726855313.09997: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30582 1726855313.10000: extending task lists for all hosts with included blocks 30582 1726855313.11161: done extending task lists 30582 1726855313.11163: done processing 
included files 30582 1726855313.11164: results queue empty 30582 1726855313.11164: checking for any_errors_fatal 30582 1726855313.11168: done checking for any_errors_fatal 30582 1726855313.11169: checking for max_fail_percentage 30582 1726855313.11170: done checking for max_fail_percentage 30582 1726855313.11171: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.11196: done checking to see if all hosts have failed 30582 1726855313.11197: getting the remaining hosts for this loop 30582 1726855313.11199: done getting the remaining hosts for this loop 30582 1726855313.11202: getting the next task for host managed_node3 30582 1726855313.11207: done getting next task for host managed_node3 30582 1726855313.11210: ^ task is: TASK: Include network role 30582 1726855313.11213: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855313.11216: getting variables 30582 1726855313.11217: in VariableManager get_vars() 30582 1726855313.11230: Calling all_inventory to load vars for managed_node3 30582 1726855313.11238: Calling groups_inventory to load vars for managed_node3 30582 1726855313.11241: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.11246: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.11248: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.11251: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.13255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.15686: done with get_vars() 30582 1726855313.15758: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 14:01:53 -0400 (0:00:00.170) 0:00:49.508 ****** 30582 1726855313.15849: entering _queue_task() for managed_node3/include_role 30582 1726855313.16244: worker is 1 (out of 1 available) 30582 1726855313.16258: exiting _queue_task() for managed_node3/include_role 30582 1726855313.16390: done queuing things up, now waiting for results queue to drain 30582 1726855313.16393: waiting for pending results... 
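The `Setup` task traced above (`tasks/run_test.yml:24`) is an `include_tasks` that iterates, per the trace, over `lsr_setup` items `tasks/create_bridge_profile.yml` and `tasks/activate_profile.yml`. A sketch of such a task, with the loop shape being an assumption:

```yaml
# Hypothetical reconstruction of the Setup task at tasks/run_test.yml:24.
- name: Setup
  include_tasks: "{{ item }}"
  # Per the trace, lsr_setup comes from include params, e.g.:
  # [tasks/create_bridge_profile.yml, tasks/activate_profile.yml]
  loop: "{{ lsr_setup }}"
```
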
30582 1726855313.16598: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855313.16752: in run() - task 0affcc66-ac2b-aa83-7d57-00000000108f 30582 1726855313.16778: variable 'ansible_search_path' from source: unknown 30582 1726855313.16790: variable 'ansible_search_path' from source: unknown 30582 1726855313.16836: calling self._execute() 30582 1726855313.16935: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.16950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.17043: variable 'omit' from source: magic vars 30582 1726855313.17518: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.17537: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.17549: _execute() done 30582 1726855313.17558: dumping result to json 30582 1726855313.17567: done dumping result, returning 30582 1726855313.17589: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-00000000108f] 30582 1726855313.17602: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000108f 30582 1726855313.17897: no more pending results, returning what we have 30582 1726855313.17903: in VariableManager get_vars() 30582 1726855313.17962: Calling all_inventory to load vars for managed_node3 30582 1726855313.17966: Calling groups_inventory to load vars for managed_node3 30582 1726855313.17970: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.17989: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.17994: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.17998: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.18530: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000108f 30582 1726855313.18533: WORKER PROCESS EXITING 30582 1726855313.20237: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.23080: done with get_vars() 30582 1726855313.23118: variable 'ansible_search_path' from source: unknown 30582 1726855313.23120: variable 'ansible_search_path' from source: unknown 30582 1726855313.23347: variable 'omit' from source: magic vars 30582 1726855313.23404: variable 'omit' from source: magic vars 30582 1726855313.23420: variable 'omit' from source: magic vars 30582 1726855313.23425: we have included files to process 30582 1726855313.23426: generating all_blocks data 30582 1726855313.23427: done generating all_blocks data 30582 1726855313.23429: processing included file: fedora.linux_system_roles.network 30582 1726855313.23450: in VariableManager get_vars() 30582 1726855313.23472: done with get_vars() 30582 1726855313.23511: in VariableManager get_vars() 30582 1726855313.23531: done with get_vars() 30582 1726855313.23572: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855313.23706: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855313.23800: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855313.24467: in VariableManager get_vars() 30582 1726855313.24497: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855313.27654: iterating over new_blocks loaded from include file 30582 1726855313.27657: in VariableManager get_vars() 30582 1726855313.27692: done with get_vars() 30582 1726855313.27695: filtering new block on tags 30582 1726855313.28000: done filtering new block on tags 30582 1726855313.28005: in VariableManager get_vars() 30582 1726855313.28020: done with get_vars() 30582 1726855313.28022: filtering new block on tags 30582 1726855313.28046: done 
filtering new block on tags 30582 1726855313.28049: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855313.28054: extending task lists for all hosts with included blocks 30582 1726855313.28227: done extending task lists 30582 1726855313.28229: done processing included files 30582 1726855313.28230: results queue empty 30582 1726855313.28230: checking for any_errors_fatal 30582 1726855313.28234: done checking for any_errors_fatal 30582 1726855313.28235: checking for max_fail_percentage 30582 1726855313.28236: done checking for max_fail_percentage 30582 1726855313.28237: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.28237: done checking to see if all hosts have failed 30582 1726855313.28238: getting the remaining hosts for this loop 30582 1726855313.28239: done getting the remaining hosts for this loop 30582 1726855313.28242: getting the next task for host managed_node3 30582 1726855313.28252: done getting next task for host managed_node3 30582 1726855313.28259: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855313.28262: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855313.28274: getting variables 30582 1726855313.28275: in VariableManager get_vars() 30582 1726855313.28291: Calling all_inventory to load vars for managed_node3 30582 1726855313.28294: Calling groups_inventory to load vars for managed_node3 30582 1726855313.28296: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.28302: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.28304: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.28307: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.29873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.32516: done with get_vars() 30582 1726855313.32551: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:01:53 -0400 (0:00:00.168) 0:00:49.676 ****** 30582 1726855313.32663: entering _queue_task() for managed_node3/include_tasks 30582 1726855313.33109: worker is 1 (out of 1 available) 30582 1726855313.33121: exiting _queue_task() for managed_node3/include_tasks 30582 1726855313.33249: done queuing things up, now waiting for results queue to drain 30582 1726855313.33251: waiting for pending results... 
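The `Include network role` task traced above (`tasks/create_bridge_profile.yml:3`) pulls in `fedora.linux_system_roles.network`, whose defaults, meta, and tasks files the trace then loads. A sketch of the including task; the exact YAML is an assumption:

```yaml
# Hypothetical reconstruction of tasks/create_bridge_profile.yml:3
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
```
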
30582 1726855313.33488: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855313.33674: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010f5 30582 1726855313.33745: variable 'ansible_search_path' from source: unknown 30582 1726855313.33990: variable 'ansible_search_path' from source: unknown 30582 1726855313.33995: calling self._execute() 30582 1726855313.33999: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.34003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.34007: variable 'omit' from source: magic vars 30582 1726855313.34616: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.34630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.34692: _execute() done 30582 1726855313.34696: dumping result to json 30582 1726855313.34698: done dumping result, returning 30582 1726855313.34701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-0000000010f5] 30582 1726855313.34703: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f5 30582 1726855313.34891: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f5 30582 1726855313.34895: WORKER PROCESS EXITING 30582 1726855313.34955: no more pending results, returning what we have 30582 1726855313.34963: in VariableManager get_vars() 30582 1726855313.35130: Calling all_inventory to load vars for managed_node3 30582 1726855313.35136: Calling groups_inventory to load vars for managed_node3 30582 1726855313.35140: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.35153: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.35157: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.35161: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855313.37831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.39867: done with get_vars() 30582 1726855313.39894: variable 'ansible_search_path' from source: unknown 30582 1726855313.39896: variable 'ansible_search_path' from source: unknown 30582 1726855313.39937: we have included files to process 30582 1726855313.39939: generating all_blocks data 30582 1726855313.39941: done generating all_blocks data 30582 1726855313.39945: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855313.39946: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855313.39949: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855313.40633: done processing included file 30582 1726855313.40635: iterating over new_blocks loaded from include file 30582 1726855313.40636: in VariableManager get_vars() 30582 1726855313.40663: done with get_vars() 30582 1726855313.40665: filtering new block on tags 30582 1726855313.40706: done filtering new block on tags 30582 1726855313.40710: in VariableManager get_vars() 30582 1726855313.40742: done with get_vars() 30582 1726855313.40744: filtering new block on tags 30582 1726855313.40800: done filtering new block on tags 30582 1726855313.40803: in VariableManager get_vars() 30582 1726855313.40831: done with get_vars() 30582 1726855313.40833: filtering new block on tags 30582 1726855313.40886: done filtering new block on tags 30582 1726855313.40897: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855313.40902: extending task lists for 
all hosts with included blocks 30582 1726855313.44082: done extending task lists 30582 1726855313.44169: done processing included files 30582 1726855313.44171: results queue empty 30582 1726855313.44172: checking for any_errors_fatal 30582 1726855313.44177: done checking for any_errors_fatal 30582 1726855313.44178: checking for max_fail_percentage 30582 1726855313.44180: done checking for max_fail_percentage 30582 1726855313.44183: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.44184: done checking to see if all hosts have failed 30582 1726855313.44184: getting the remaining hosts for this loop 30582 1726855313.44186: done getting the remaining hosts for this loop 30582 1726855313.44211: getting the next task for host managed_node3 30582 1726855313.44221: done getting next task for host managed_node3 30582 1726855313.44225: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855313.44229: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855313.44242: getting variables 30582 1726855313.44243: in VariableManager get_vars() 30582 1726855313.44262: Calling all_inventory to load vars for managed_node3 30582 1726855313.44268: Calling groups_inventory to load vars for managed_node3 30582 1726855313.44271: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.44278: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.44280: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.44283: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.45666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.47911: done with get_vars() 30582 1726855313.47946: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:01:53 -0400 (0:00:00.153) 0:00:49.830 ****** 30582 1726855313.48053: entering _queue_task() for managed_node3/setup 30582 1726855313.48614: worker is 1 (out of 1 available) 30582 1726855313.48627: exiting _queue_task() for managed_node3/setup 30582 1726855313.48640: done queuing things up, now waiting for results queue to drain 30582 1726855313.48643: waiting for pending results... 
30582 1726855313.49155: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855313.49653: in run() - task 0affcc66-ac2b-aa83-7d57-000000001152 30582 1726855313.49657: variable 'ansible_search_path' from source: unknown 30582 1726855313.49660: variable 'ansible_search_path' from source: unknown 30582 1726855313.49663: calling self._execute() 30582 1726855313.49916: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.49920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.49930: variable 'omit' from source: magic vars 30582 1726855313.50977: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.51105: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.51512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855313.56870: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855313.57192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855313.57196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855313.57199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855313.57210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855313.57433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855313.57457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855313.57486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855313.57528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855313.57543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855313.57864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855313.57868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855313.57870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855313.57892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855313.57906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855313.58460: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855313.58470: variable 'ansible_facts' from source: unknown 30582 1726855313.60783: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855313.60830: when evaluation is False, skipping this task 30582 1726855313.60897: _execute() done 30582 1726855313.60909: dumping result to json 30582 1726855313.61076: done dumping result, returning 30582 1726855313.61081: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000001152] 30582 1726855313.61084: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001152 30582 1726855313.61162: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001152 30582 1726855313.61166: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855313.61228: no more pending results, returning what we have 30582 1726855313.61232: results queue empty 30582 1726855313.61234: checking for any_errors_fatal 30582 1726855313.61235: done checking for any_errors_fatal 30582 1726855313.61236: checking for max_fail_percentage 30582 1726855313.61238: done checking for max_fail_percentage 30582 1726855313.61238: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.61239: done checking to see if all hosts have failed 30582 1726855313.61240: getting the remaining hosts for this loop 30582 1726855313.61241: done getting the remaining hosts for this loop 30582 1726855313.61245: getting the next task for host managed_node3 30582 1726855313.61256: done getting next task for host managed_node3 30582 1726855313.61259: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855313.61266: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855313.61291: getting variables 30582 1726855313.61293: in VariableManager get_vars() 30582 1726855313.61333: Calling all_inventory to load vars for managed_node3 30582 1726855313.61335: Calling groups_inventory to load vars for managed_node3 30582 1726855313.61337: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.61348: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.61351: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.61358: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.63761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.67635: done with get_vars() 30582 1726855313.67675: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:01:53 -0400 (0:00:00.198) 0:00:50.028 ****** 30582 1726855313.67864: entering _queue_task() for managed_node3/stat 30582 1726855313.68249: worker is 1 (out of 1 available) 30582 1726855313.68265: exiting _queue_task() for managed_node3/stat 30582 1726855313.68281: done queuing things up, now waiting for results queue to drain 30582 1726855313.68283: waiting for pending results... 
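(Editor's note, not part of the log.) The `Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False` line above is the role deciding whether any facts it needs are still ungathered. A minimal Python sketch of the equivalent set-difference logic; the fact names below are hypothetical stand-ins, the real list comes from the role's defaults:

```python
# Hypothetical required facts; the role reads these from its defaults file.
required_facts = ["distribution", "distribution_major_version", "os_family"]

# Hypothetical snapshot of ansible_facts for the host.
gathered_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja2's `difference` filter keeps items of the first list that are
# absent from the second; this is the Python equivalent.
missing = [fact for fact in required_facts if fact not in gathered_facts]

# The setup task only runs when something is missing. Here nothing is,
# matching the "Evaluated conditional ...: False" / "skipping" lines above.
needs_gather = len(missing) > 0
print(needs_gather)  # prints False
```

Because the conditional is False, the task is skipped and (as the log shows) its result is censored due to `no_log: true`.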
30582 1726855313.68675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855313.68748: in run() - task 0affcc66-ac2b-aa83-7d57-000000001154 30582 1726855313.68770: variable 'ansible_search_path' from source: unknown 30582 1726855313.68776: variable 'ansible_search_path' from source: unknown 30582 1726855313.68823: calling self._execute() 30582 1726855313.68933: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.68937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.68940: variable 'omit' from source: magic vars 30582 1726855313.69310: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.69324: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.69500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855313.69789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855313.69839: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855313.69981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855313.69984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855313.70001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855313.70031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855313.70057: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855313.70082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855313.70208: variable '__network_is_ostree' from source: set_fact 30582 1726855313.70211: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855313.70214: when evaluation is False, skipping this task 30582 1726855313.70216: _execute() done 30582 1726855313.70218: dumping result to json 30582 1726855313.70220: done dumping result, returning 30582 1726855313.70222: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000001154] 30582 1726855313.70225: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001154 30582 1726855313.70405: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001154 30582 1726855313.70408: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855313.70469: no more pending results, returning what we have 30582 1726855313.70475: results queue empty 30582 1726855313.70476: checking for any_errors_fatal 30582 1726855313.70489: done checking for any_errors_fatal 30582 1726855313.70490: checking for max_fail_percentage 30582 1726855313.70492: done checking for max_fail_percentage 30582 1726855313.70493: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.70496: done checking to see if all hosts have failed 30582 1726855313.70497: getting the remaining hosts for this loop 30582 1726855313.70499: done getting the remaining hosts for this loop 30582 
1726855313.70502: getting the next task for host managed_node3 30582 1726855313.70509: done getting next task for host managed_node3 30582 1726855313.70513: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855313.70521: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855313.70544: getting variables 30582 1726855313.70546: in VariableManager get_vars() 30582 1726855313.70709: Calling all_inventory to load vars for managed_node3 30582 1726855313.70729: Calling groups_inventory to load vars for managed_node3 30582 1726855313.70737: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.70746: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.70749: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.70753: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.73029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.74830: done with get_vars() 30582 1726855313.74864: done getting variables 30582 1726855313.74933: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:01:53 -0400 (0:00:00.071) 0:00:50.099 ****** 30582 1726855313.74971: entering _queue_task() for managed_node3/set_fact 30582 1726855313.75478: worker is 1 (out of 1 available) 30582 1726855313.75493: exiting _queue_task() for managed_node3/set_fact 30582 1726855313.75503: done queuing things up, now waiting for results queue to drain 30582 1726855313.75505: waiting for pending results... 
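(Editor's note, not part of the log.) Both ostree-related tasks above skip on `Evaluated conditional (not __network_is_ostree is defined): False`: the variable was already set by an earlier `set_fact` in the run, and Jinja2's `is defined` test checks only presence, not truthiness. A small sketch of that semantics; the variable store is a hypothetical stand-in:

```python
# Hypothetical host variable store; __network_is_ostree was set earlier
# in the run by set_fact (its value may even be False).
host_vars = {"__network_is_ostree": False}

# Jinja2's `is defined` test maps to a presence check, not a truth check.
is_defined = "__network_is_ostree" in host_vars

# The task's when: clause is `not __network_is_ostree is defined`.
condition = not is_defined
print(condition)  # prints False, matching the log's skip reason
```

This is why the stat task ("Check if system is ostree") and the follow-up `set_fact` both report `"false_condition": "not __network_is_ostree is defined"` and skip.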
30582 1726855313.75938: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855313.75946: in run() - task 0affcc66-ac2b-aa83-7d57-000000001155 30582 1726855313.75950: variable 'ansible_search_path' from source: unknown 30582 1726855313.75954: variable 'ansible_search_path' from source: unknown 30582 1726855313.75957: calling self._execute() 30582 1726855313.76051: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.76056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.76065: variable 'omit' from source: magic vars 30582 1726855313.76486: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.76531: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.76684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855313.77001: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855313.77042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855313.77091: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855313.77127: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855313.77219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855313.77304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855313.77308: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855313.77311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855313.77414: variable '__network_is_ostree' from source: set_fact 30582 1726855313.77422: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855313.77425: when evaluation is False, skipping this task 30582 1726855313.77427: _execute() done 30582 1726855313.77430: dumping result to json 30582 1726855313.77515: done dumping result, returning 30582 1726855313.77518: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000001155] 30582 1726855313.77520: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001155 30582 1726855313.77736: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001155 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855313.77791: no more pending results, returning what we have 30582 1726855313.77795: results queue empty 30582 1726855313.77797: checking for any_errors_fatal 30582 1726855313.77801: done checking for any_errors_fatal 30582 1726855313.77802: checking for max_fail_percentage 30582 1726855313.77804: done checking for max_fail_percentage 30582 1726855313.77805: checking to see if all hosts have failed and the running result is not ok 30582 1726855313.77805: done checking to see if all hosts have failed 30582 1726855313.77806: getting the remaining hosts for this loop 30582 1726855313.77807: done getting the remaining hosts for this loop 30582 1726855313.77811: getting the next task for 
host managed_node3 30582 1726855313.77822: done getting next task for host managed_node3 30582 1726855313.77826: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855313.77832: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855313.77848: WORKER PROCESS EXITING 30582 1726855313.77990: getting variables 30582 1726855313.77992: in VariableManager get_vars() 30582 1726855313.78028: Calling all_inventory to load vars for managed_node3 30582 1726855313.78032: Calling groups_inventory to load vars for managed_node3 30582 1726855313.78034: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855313.78043: Calling all_plugins_play to load vars for managed_node3 30582 1726855313.78046: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855313.78051: Calling groups_plugins_play to load vars for managed_node3 30582 1726855313.79912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855313.81602: done with get_vars() 30582 1726855313.81626: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:01:53 -0400 (0:00:00.068) 0:00:50.168 ****** 30582 1726855313.81844: entering _queue_task() for managed_node3/service_facts 30582 1726855313.82781: worker is 1 (out of 1 available) 30582 1726855313.82794: exiting _queue_task() for managed_node3/service_facts 30582 1726855313.82806: done queuing things up, now waiting for results queue to drain 30582 1726855313.82808: waiting for pending results... 
30582 1726855313.83085: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855313.83255: in run() - task 0affcc66-ac2b-aa83-7d57-000000001157 30582 1726855313.83268: variable 'ansible_search_path' from source: unknown 30582 1726855313.83272: variable 'ansible_search_path' from source: unknown 30582 1726855313.83353: calling self._execute() 30582 1726855313.83421: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.83425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.83462: variable 'omit' from source: magic vars 30582 1726855313.83854: variable 'ansible_distribution_major_version' from source: facts 30582 1726855313.83867: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855313.83870: variable 'omit' from source: magic vars 30582 1726855313.83978: variable 'omit' from source: magic vars 30582 1726855313.84010: variable 'omit' from source: magic vars 30582 1726855313.84035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855313.84083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855313.84096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855313.84118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855313.84192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855313.84196: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855313.84199: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.84201: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855313.84290: Set connection var ansible_timeout to 10 30582 1726855313.84294: Set connection var ansible_connection to ssh 30582 1726855313.84301: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855313.84306: Set connection var ansible_pipelining to False 30582 1726855313.84316: Set connection var ansible_shell_executable to /bin/sh 30582 1726855313.84318: Set connection var ansible_shell_type to sh 30582 1726855313.84424: variable 'ansible_shell_executable' from source: unknown 30582 1726855313.84427: variable 'ansible_connection' from source: unknown 30582 1726855313.84430: variable 'ansible_module_compression' from source: unknown 30582 1726855313.84432: variable 'ansible_shell_type' from source: unknown 30582 1726855313.84434: variable 'ansible_shell_executable' from source: unknown 30582 1726855313.84438: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855313.84441: variable 'ansible_pipelining' from source: unknown 30582 1726855313.84443: variable 'ansible_timeout' from source: unknown 30582 1726855313.84446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855313.84612: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855313.84617: variable 'omit' from source: magic vars 30582 1726855313.84619: starting attempt loop 30582 1726855313.84621: running the handler 30582 1726855313.84624: _low_level_execute_command(): starting 30582 1726855313.84626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855313.85642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855313.85646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855313.85648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855313.85801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855313.85927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855313.88109: stdout chunk (state=3): >>>/root <<< 30582 1726855313.88113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855313.88116: stdout chunk (state=3): >>><<< 30582 1726855313.88118: stderr chunk (state=3): >>><<< 30582 1726855313.88120: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855313.88122: _low_level_execute_command(): starting 30582 1726855313.88125: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715 `" && echo ansible-tmp-1726855313.8800888-32966-26381879449715="` echo /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715 `" ) && sleep 0' 30582 1726855313.88811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855313.88935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855313.88981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855313.91041: stdout chunk (state=3): >>>ansible-tmp-1726855313.8800888-32966-26381879449715=/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715 <<< 30582 1726855313.91132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855313.91202: stderr chunk (state=3): >>><<< 30582 1726855313.91225: stdout chunk (state=3): >>><<< 30582 1726855313.91281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855313.8800888-32966-26381879449715=/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855313.91493: variable 'ansible_module_compression' from source: unknown 30582 1726855313.91496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855313.91792: variable 'ansible_facts' from source: unknown 30582 1726855313.91795: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py 30582 1726855313.92138: Sending initial data 30582 1726855313.92141: Sent initial data (161 bytes) 30582 1726855313.92915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855313.92919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855313.92921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 30582 1726855313.93203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855313.94626: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855313.94684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855313.94861: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpn56mi_51 /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py <<< 30582 1726855313.94873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py" <<< 30582 1726855313.94952: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpn56mi_51" to remote "/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py" <<< 30582 1726855313.96934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855313.97007: stderr chunk (state=3): >>><<< 30582 1726855313.97015: stdout chunk (state=3): >>><<< 30582 1726855313.97073: done transferring module to remote 30582 1726855313.97282: _low_level_execute_command(): starting 30582 1726855313.97286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/ /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py && sleep 0' 30582 1726855313.98515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855313.98519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855313.98521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855313.98524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855313.98526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855313.98527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855313.98707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855313.98710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855313.98713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855313.98800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855314.00815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855314.00819: stdout chunk (state=3): >>><<< 30582 1726855314.00822: stderr chunk (state=3): >>><<< 30582 1726855314.00975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855314.00981: _low_level_execute_command(): starting 30582 1726855314.00984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/AnsiballZ_service_facts.py && sleep 0' 30582 1726855314.02041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855314.02296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK <<< 30582 1726855314.02350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855314.02485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.54149: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": 
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855315.55630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855315.55659: stderr chunk (state=3): >>><<< 30582 1726855315.55663: stdout chunk (state=3): >>><<< 30582 1726855315.55696: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855315.56293: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855315.56297: _low_level_execute_command(): starting 30582 1726855315.56299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855313.8800888-32966-26381879449715/ > /dev/null 2>&1 && sleep 0' 30582 1726855315.57024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855315.57028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855315.57030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855315.57033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.57098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855315.57123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.57196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.59057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855315.59090: stderr chunk (state=3): >>><<< 30582 1726855315.59094: stdout chunk (state=3): >>><<< 30582 1726855315.59105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855315.59111: handler run 
complete 30582 1726855315.59228: variable 'ansible_facts' from source: unknown 30582 1726855315.59328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855315.59670: variable 'ansible_facts' from source: unknown 30582 1726855315.59824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855315.59993: attempt loop complete, returning result 30582 1726855315.59996: _execute() done 30582 1726855315.59998: dumping result to json 30582 1726855315.60021: done dumping result, returning 30582 1726855315.60024: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000001157] 30582 1726855315.60030: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001157 30582 1726855315.61022: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001157 30582 1726855315.61025: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855315.61106: no more pending results, returning what we have 30582 1726855315.61109: results queue empty 30582 1726855315.61110: checking for any_errors_fatal 30582 1726855315.61114: done checking for any_errors_fatal 30582 1726855315.61114: checking for max_fail_percentage 30582 1726855315.61116: done checking for max_fail_percentage 30582 1726855315.61117: checking to see if all hosts have failed and the running result is not ok 30582 1726855315.61117: done checking to see if all hosts have failed 30582 1726855315.61118: getting the remaining hosts for this loop 30582 1726855315.61119: done getting the remaining hosts for this loop 30582 1726855315.61122: getting the next task for host managed_node3 30582 1726855315.61128: done getting next task for host managed_node3 30582 
1726855315.61131: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855315.61141: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855315.61151: getting variables 30582 1726855315.61152: in VariableManager get_vars() 30582 1726855315.61179: Calling all_inventory to load vars for managed_node3 30582 1726855315.61181: Calling groups_inventory to load vars for managed_node3 30582 1726855315.61182: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855315.61191: Calling all_plugins_play to load vars for managed_node3 30582 1726855315.61193: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855315.61195: Calling groups_plugins_play to load vars for managed_node3 30582 1726855315.62068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855315.63725: done with get_vars() 30582 1726855315.63763: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:01:55 -0400 (0:00:01.820) 0:00:51.988 ****** 30582 1726855315.63868: entering _queue_task() for managed_node3/package_facts 30582 1726855315.64179: worker is 1 (out of 1 available) 30582 1726855315.64196: exiting _queue_task() for managed_node3/package_facts 30582 1726855315.64210: done queuing things up, now waiting for results queue to drain 30582 1726855315.64212: waiting for pending results... 
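The `"censored"` result a few lines above is what `no_log: true` produces: the module output is replaced with a fixed message while the `changed` flag survives. A minimal sketch of that kind of redaction — this is an illustration, not Ansible's actual implementation, and `censor_result` is a hypothetical helper:

```python
# The literal message Ansible prints for a no_log result, as seen in the log above.
CENSORED = ("the output has been hidden due to the fact that "
            "'no_log: true' was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    """Sketch of no_log redaction: keep status flags, hide everything else."""
    if not no_log:
        return result
    redacted = {"censored": CENSORED}
    for key in ("changed", "failed"):  # status flags remain visible
        if key in result:
            redacted[key] = result[key]
    return redacted

raw = {"changed": False, "ansible_facts": {"services": {"dbus.service": "active"}}}
print(censor_result(raw, no_log=True))  # facts are gone; only 'censored' + 'changed' remain
```

Note that `no_log` only suppresses display and logging; the task still ran and its facts were registered, which is why the play continues normally below.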
30582 1726855315.64624: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855315.64708: in run() - task 0affcc66-ac2b-aa83-7d57-000000001158 30582 1726855315.64735: variable 'ansible_search_path' from source: unknown 30582 1726855315.64740: variable 'ansible_search_path' from source: unknown 30582 1726855315.64786: calling self._execute() 30582 1726855315.64961: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855315.64966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855315.64969: variable 'omit' from source: magic vars 30582 1726855315.67382: variable 'ansible_distribution_major_version' from source: facts 30582 1726855315.67437: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855315.67441: variable 'omit' from source: magic vars 30582 1726855315.67840: variable 'omit' from source: magic vars 30582 1726855315.67844: variable 'omit' from source: magic vars 30582 1726855315.67878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855315.68231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855315.68252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855315.68275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855315.68284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855315.68322: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855315.68326: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855315.68328: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855315.68536: Set connection var ansible_timeout to 10 30582 1726855315.68542: Set connection var ansible_connection to ssh 30582 1726855315.68552: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855315.68559: Set connection var ansible_pipelining to False 30582 1726855315.68566: Set connection var ansible_shell_executable to /bin/sh 30582 1726855315.68574: Set connection var ansible_shell_type to sh 30582 1726855315.68608: variable 'ansible_shell_executable' from source: unknown 30582 1726855315.68795: variable 'ansible_connection' from source: unknown 30582 1726855315.68799: variable 'ansible_module_compression' from source: unknown 30582 1726855315.68801: variable 'ansible_shell_type' from source: unknown 30582 1726855315.68803: variable 'ansible_shell_executable' from source: unknown 30582 1726855315.68805: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855315.68807: variable 'ansible_pipelining' from source: unknown 30582 1726855315.68809: variable 'ansible_timeout' from source: unknown 30582 1726855315.68811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855315.69303: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855315.69413: variable 'omit' from source: magic vars 30582 1726855315.69416: starting attempt loop 30582 1726855315.69419: running the handler 30582 1726855315.69421: _low_level_execute_command(): starting 30582 1726855315.69423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855315.71165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855315.71302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.71306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855315.71308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.71615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855315.71735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.71865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.73559: stdout chunk (state=3): >>>/root <<< 30582 1726855315.73695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855315.73719: stderr chunk (state=3): >>><<< 30582 1726855315.73726: stdout chunk (state=3): >>><<< 30582 1726855315.73758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855315.73772: _low_level_execute_command(): starting 30582 1726855315.73895: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889 `" && echo ansible-tmp-1726855315.7375736-33045-84481300644889="` echo /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889 `" ) && sleep 0' 30582 1726855315.74510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855315.74741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.74745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855315.74748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.74815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.76714: stdout chunk (state=3): >>>ansible-tmp-1726855315.7375736-33045-84481300644889=/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889 <<< 30582 1726855315.76882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855315.76886: stdout chunk (state=3): >>><<< 30582 1726855315.76890: stderr chunk (state=3): >>><<< 30582 1726855315.77116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855315.7375736-33045-84481300644889=/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855315.77120: variable 'ansible_module_compression' from source: unknown 30582 1726855315.77123: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855315.77312: variable 'ansible_facts' from source: unknown 30582 1726855315.77752: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py 30582 1726855315.78307: Sending initial data 30582 1726855315.78357: Sent initial data (161 bytes) 30582 1726855315.79110: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.79178: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855315.79198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855315.79227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.79316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.80958: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855315.81161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855315.81223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpqvob_ivj /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py <<< 30582 1726855315.81226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py" <<< 30582 1726855315.81305: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpqvob_ivj" to remote "/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py" <<< 30582 1726855315.83349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855315.83353: stderr chunk (state=3): >>><<< 30582 1726855315.83355: stdout chunk (state=3): >>><<< 30582 1726855315.83358: done transferring module to remote 30582 1726855315.83469: _low_level_execute_command(): starting 30582 1726855315.83485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/ /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py && sleep 0' 30582 1726855315.84311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855315.84327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855315.84342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855315.84368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855315.84402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855315.84417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855315.84481: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855315.84527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855315.84547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855315.84585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.84681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855315.86898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855315.86902: stdout chunk (state=3): >>><<< 30582 1726855315.86905: stderr chunk (state=3): >>><<< 30582 1726855315.86907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855315.86910: _low_level_execute_command(): starting 30582 1726855315.86912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/AnsiballZ_package_facts.py && sleep 0' 30582 1726855315.88303: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855315.88355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855315.88375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855315.88473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855316.32392: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": 
"libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30582 1726855316.32694: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": 
"libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": 
"1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": 
[{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": 
"1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": 
[{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": 
"28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": 
"libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": 
"mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": 
"2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855316.34411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855316.34415: stdout chunk (state=3): >>><<< 30582 1726855316.34453: stderr chunk (state=3): >>><<< 30582 1726855316.34463: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": 
[{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": 
"1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", 
"release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855316.37936: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855316.37965: _low_level_execute_command(): starting 30582 1726855316.37994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855315.7375736-33045-84481300644889/ > /dev/null 2>&1 && sleep 0' 30582 1726855316.38748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855316.38766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855316.38905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855316.38938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855316.38960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855316.38981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855316.39113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855316.41048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855316.41052: stdout chunk (state=3): >>><<< 30582 1726855316.41054: stderr chunk (state=3): >>><<< 30582 1726855316.41077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855316.41154: handler run complete 30582 1726855316.41980: variable 'ansible_facts' from source: unknown 30582 1726855316.42443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.43563: variable 'ansible_facts' from source: unknown 30582 1726855316.44049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.44662: attempt loop complete, returning result 30582 1726855316.44666: _execute() done 30582 1726855316.44668: dumping result to json 30582 1726855316.44915: done dumping result, returning 30582 1726855316.44918: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000001158] 30582 1726855316.44920: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001158 30582 1726855316.47845: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001158 30582 1726855316.47849: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855316.48097: no more pending results, returning what we have 30582 1726855316.48100: results queue empty 30582 1726855316.48102: checking for any_errors_fatal 30582 1726855316.48108: done checking for 
any_errors_fatal 30582 1726855316.48108: checking for max_fail_percentage 30582 1726855316.48115: done checking for max_fail_percentage 30582 1726855316.48117: checking to see if all hosts have failed and the running result is not ok 30582 1726855316.48117: done checking to see if all hosts have failed 30582 1726855316.48118: getting the remaining hosts for this loop 30582 1726855316.48119: done getting the remaining hosts for this loop 30582 1726855316.48123: getting the next task for host managed_node3 30582 1726855316.48131: done getting next task for host managed_node3 30582 1726855316.48135: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855316.48141: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855316.48250: getting variables 30582 1726855316.48252: in VariableManager get_vars() 30582 1726855316.48297: Calling all_inventory to load vars for managed_node3 30582 1726855316.48301: Calling groups_inventory to load vars for managed_node3 30582 1726855316.48303: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855316.48313: Calling all_plugins_play to load vars for managed_node3 30582 1726855316.48316: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855316.48319: Calling groups_plugins_play to load vars for managed_node3 30582 1726855316.49919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.52961: done with get_vars() 30582 1726855316.53053: done getting variables 30582 1726855316.53256: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:01:56 -0400 (0:00:00.894) 0:00:52.882 ****** 30582 1726855316.53305: entering _queue_task() for managed_node3/debug 30582 1726855316.53876: worker is 1 (out of 1 available) 30582 1726855316.53891: exiting _queue_task() for managed_node3/debug 30582 1726855316.53906: done queuing things up, now waiting for results queue to drain 30582 1726855316.53907: waiting for pending results... 
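For context on the `"censored"` result above: the `ok: [managed_node3]` line hides its output because the task ran with `no_log: true` (the `_ansible_no_log: True` flag also appears in the `_execute_module` call). A hedged sketch of a task shaped like that "Check which packages are installed" step follows; the module arguments mirror the log's `module_args`, but the exact task layout in the role source is an assumption:

```yaml
# Hedged sketch, not the actual role source. The manager/strategy values
# mirror module_args in the log ({"manager": ["auto"], "strategy": "first"});
# no_log: true is why the result prints as "censored" above.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true
```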
30582 1726855316.54183: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855316.54391: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010f6 30582 1726855316.54396: variable 'ansible_search_path' from source: unknown 30582 1726855316.54399: variable 'ansible_search_path' from source: unknown 30582 1726855316.54402: calling self._execute() 30582 1726855316.54448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.54452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.54461: variable 'omit' from source: magic vars 30582 1726855316.54882: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.54896: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855316.54904: variable 'omit' from source: magic vars 30582 1726855316.55054: variable 'omit' from source: magic vars 30582 1726855316.55082: variable 'network_provider' from source: set_fact 30582 1726855316.55103: variable 'omit' from source: magic vars 30582 1726855316.55144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855316.55180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855316.55215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855316.55218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855316.55230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855316.55262: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855316.55267: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855316.55269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.55386: Set connection var ansible_timeout to 10 30582 1726855316.55392: Set connection var ansible_connection to ssh 30582 1726855316.55394: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855316.55397: Set connection var ansible_pipelining to False 30582 1726855316.55496: Set connection var ansible_shell_executable to /bin/sh 30582 1726855316.55500: Set connection var ansible_shell_type to sh 30582 1726855316.55502: variable 'ansible_shell_executable' from source: unknown 30582 1726855316.55505: variable 'ansible_connection' from source: unknown 30582 1726855316.55508: variable 'ansible_module_compression' from source: unknown 30582 1726855316.55510: variable 'ansible_shell_type' from source: unknown 30582 1726855316.55512: variable 'ansible_shell_executable' from source: unknown 30582 1726855316.55514: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.55517: variable 'ansible_pipelining' from source: unknown 30582 1726855316.55519: variable 'ansible_timeout' from source: unknown 30582 1726855316.55521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.55605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855316.55608: variable 'omit' from source: magic vars 30582 1726855316.55611: starting attempt loop 30582 1726855316.55613: running the handler 30582 1726855316.55711: handler run complete 30582 1726855316.55714: attempt loop complete, returning result 30582 1726855316.55717: _execute() done 30582 1726855316.55719: dumping result to json 30582 1726855316.55722: done dumping result, returning 
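The run of "Set connection var ..." lines above reflects per-host connection settings resolved from inventory and defaults. A hedged inventory sketch that would produce those values follows; the values (ssh connection, timeout 10, pipelining off, `/bin/sh` shell) mirror the log, but expressing them as host vars in a YAML inventory is an illustrative assumption, not the actual inventory used:

```yaml
# Hedged sketch of inventory host vars matching the
# "Set connection var" lines in the log; the host entry
# itself is illustrative.
all:
  hosts:
    managed_node3:
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
```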
30582 1726855316.55724: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-0000000010f6] 30582 1726855316.55726: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f6 30582 1726855316.55796: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f6 30582 1726855316.55799: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855316.56089: no more pending results, returning what we have 30582 1726855316.56093: results queue empty 30582 1726855316.56094: checking for any_errors_fatal 30582 1726855316.56101: done checking for any_errors_fatal 30582 1726855316.56101: checking for max_fail_percentage 30582 1726855316.56103: done checking for max_fail_percentage 30582 1726855316.56104: checking to see if all hosts have failed and the running result is not ok 30582 1726855316.56105: done checking to see if all hosts have failed 30582 1726855316.56105: getting the remaining hosts for this loop 30582 1726855316.56107: done getting the remaining hosts for this loop 30582 1726855316.56110: getting the next task for host managed_node3 30582 1726855316.56117: done getting next task for host managed_node3 30582 1726855316.56121: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855316.56126: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855316.56138: getting variables 30582 1726855316.56140: in VariableManager get_vars() 30582 1726855316.56175: Calling all_inventory to load vars for managed_node3 30582 1726855316.56178: Calling groups_inventory to load vars for managed_node3 30582 1726855316.56180: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855316.56191: Calling all_plugins_play to load vars for managed_node3 30582 1726855316.56194: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855316.56198: Calling groups_plugins_play to load vars for managed_node3 30582 1726855316.57551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.59140: done with get_vars() 30582 1726855316.59185: done getting variables 30582 1726855316.59247: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:01:56 -0400 (0:00:00.059) 0:00:52.942 ****** 30582 1726855316.59294: entering _queue_task() for managed_node3/fail 30582 1726855316.59675: worker is 1 (out of 1 available) 30582 1726855316.59891: exiting _queue_task() for managed_node3/fail 30582 1726855316.59905: done queuing things up, now waiting for results queue to drain 30582 1726855316.59907: waiting for pending results... 30582 1726855316.60137: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855316.60402: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010f7 30582 1726855316.60432: variable 'ansible_search_path' from source: unknown 30582 1726855316.60440: variable 'ansible_search_path' from source: unknown 30582 1726855316.60766: calling self._execute() 30582 1726855316.60773: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.60776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.60779: variable 'omit' from source: magic vars 30582 1726855316.61848: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.61852: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855316.62095: variable 'network_state' from source: role '' defaults 30582 1726855316.62098: Evaluated conditional (network_state != {}): False 30582 1726855316.62101: when evaluation is False, skipping this task 30582 1726855316.62102: _execute() done 30582 1726855316.62105: dumping result to json 30582 1726855316.62106: done dumping result, returning 30582 1726855316.62108: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-0000000010f7] 30582 1726855316.62110: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f7 30582 1726855316.62181: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f7 30582 1726855316.62183: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855316.62230: no more pending results, returning what we have 30582 1726855316.62233: results queue empty 30582 1726855316.62234: checking for any_errors_fatal 30582 1726855316.62240: done checking for any_errors_fatal 30582 1726855316.62241: checking for max_fail_percentage 30582 1726855316.62243: done checking for max_fail_percentage 30582 1726855316.62244: checking to see if all hosts have failed and the running result is not ok 30582 1726855316.62244: done checking to see if all hosts have failed 30582 1726855316.62245: getting the remaining hosts for this loop 30582 1726855316.62246: done getting the remaining hosts for this loop 30582 1726855316.62249: getting the next task for host managed_node3 30582 1726855316.62256: done getting next task for host managed_node3 30582 1726855316.62260: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855316.62265: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855316.62285: getting variables 30582 1726855316.62289: in VariableManager get_vars() 30582 1726855316.62325: Calling all_inventory to load vars for managed_node3 30582 1726855316.62328: Calling groups_inventory to load vars for managed_node3 30582 1726855316.62330: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855316.62341: Calling all_plugins_play to load vars for managed_node3 30582 1726855316.62345: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855316.62348: Calling groups_plugins_play to load vars for managed_node3 30582 1726855316.64860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.67019: done with get_vars() 30582 1726855316.67053: done getting variables 30582 1726855316.67116: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:01:56 -0400 (0:00:00.078) 0:00:53.021 ****** 30582 1726855316.67153: entering _queue_task() for managed_node3/fail 30582 1726855316.67508: worker is 1 (out of 1 available) 30582 1726855316.67522: exiting _queue_task() for managed_node3/fail 30582 1726855316.67535: done queuing things up, now waiting for results queue to drain 30582 1726855316.67537: waiting for pending results... 30582 1726855316.68053: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855316.68058: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010f8 30582 1726855316.68062: variable 'ansible_search_path' from source: unknown 30582 1726855316.68064: variable 'ansible_search_path' from source: unknown 30582 1726855316.68067: calling self._execute() 30582 1726855316.68111: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.68116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.68132: variable 'omit' from source: magic vars 30582 1726855316.68824: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.68837: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855316.68985: variable 'network_state' from source: role '' defaults 30582 1726855316.69146: Evaluated conditional (network_state != {}): False 30582 1726855316.69149: when evaluation is False, skipping this task 30582 1726855316.69152: _execute() done 30582 1726855316.69155: dumping result to json 30582 1726855316.69157: done dumping result, returning 30582 1726855316.69165: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-0000000010f8] 30582 1726855316.69174: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f8 30582 1726855316.69320: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f8 30582 1726855316.69323: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855316.69397: no more pending results, returning what we have 30582 1726855316.69402: results queue empty 30582 1726855316.69403: checking for any_errors_fatal 30582 1726855316.69412: done checking for any_errors_fatal 30582 1726855316.69412: checking for max_fail_percentage 30582 1726855316.69415: done checking for max_fail_percentage 30582 1726855316.69416: checking to see if all hosts have failed and the running result is not ok 30582 1726855316.69416: done checking to see if all hosts have failed 30582 1726855316.69417: getting the remaining hosts for this loop 30582 1726855316.69419: done getting the remaining hosts for this loop 30582 1726855316.69423: getting the next task for host managed_node3 30582 1726855316.69432: done getting next task for host managed_node3 30582 1726855316.69436: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855316.69442: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855316.69467: getting variables 30582 1726855316.69469: in VariableManager get_vars() 30582 1726855316.69510: Calling all_inventory to load vars for managed_node3 30582 1726855316.69513: Calling groups_inventory to load vars for managed_node3 30582 1726855316.69516: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855316.69528: Calling all_plugins_play to load vars for managed_node3 30582 1726855316.69531: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855316.69535: Calling groups_plugins_play to load vars for managed_node3 30582 1726855316.72536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.74005: done with get_vars() 30582 1726855316.74030: done getting variables 30582 1726855316.74099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:01:56 -0400 (0:00:00.069) 0:00:53.091 ****** 30582 1726855316.74137: entering _queue_task() for managed_node3/fail 30582 1726855316.74943: worker is 1 (out of 1 available) 30582 1726855316.74956: exiting _queue_task() for managed_node3/fail 30582 1726855316.74967: done queuing things up, now waiting for results queue to drain 30582 1726855316.74969: waiting for pending results... 30582 1726855316.76230: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855316.76634: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010f9 30582 1726855316.76657: variable 'ansible_search_path' from source: unknown 30582 1726855316.76666: variable 'ansible_search_path' from source: unknown 30582 1726855316.76709: calling self._execute() 30582 1726855316.76924: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.76991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.77292: variable 'omit' from source: magic vars 30582 1726855316.77823: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.78094: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855316.78257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855316.87208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855316.87252: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855316.87278: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855316.87304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855316.87348: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855316.87482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855316.87486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855316.87491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855316.87493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855316.87555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855316.87599: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.87602: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855316.87892: variable 'ansible_distribution' from source: facts 30582 1726855316.87896: variable '__network_rh_distros' from source: role '' defaults 30582 1726855316.87898: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855316.87958: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855316.87980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855316.88008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855316.88043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855316.88056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855316.88103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855316.88128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855316.88148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855316.88185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855316.88201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855316.88235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855316.88256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855316.88279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855316.88318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855316.88331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855316.88623: variable 'network_connections' from source: include params 30582 1726855316.88634: variable 'interface' from source: play vars 30582 1726855316.88697: variable 'interface' from source: play vars 30582 1726855316.88706: variable 'network_state' from source: role '' defaults 30582 1726855316.88769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855316.88878: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855316.88911: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855316.88933: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855316.88954: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855316.88995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855316.89018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855316.89040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855316.89058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855316.89088: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855316.89092: when evaluation is False, skipping this task 30582 1726855316.89094: _execute() done 30582 1726855316.89104: dumping result to json 30582 1726855316.89107: done dumping result, returning 30582 1726855316.89109: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-0000000010f9] 30582 1726855316.89113: sending task result for task 
0affcc66-ac2b-aa83-7d57-0000000010f9 30582 1726855316.89197: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010f9 30582 1726855316.89200: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855316.89259: no more pending results, returning what we have 30582 1726855316.89262: results queue empty 30582 1726855316.89263: checking for any_errors_fatal 30582 1726855316.89273: done checking for any_errors_fatal 30582 1726855316.89274: checking for max_fail_percentage 30582 1726855316.89276: done checking for max_fail_percentage 30582 1726855316.89277: checking to see if all hosts have failed and the running result is not ok 30582 1726855316.89277: done checking to see if all hosts have failed 30582 1726855316.89278: getting the remaining hosts for this loop 30582 1726855316.89279: done getting the remaining hosts for this loop 30582 1726855316.89283: getting the next task for host managed_node3 30582 1726855316.89293: done getting next task for host managed_node3 30582 1726855316.89297: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855316.89302: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855316.89328: getting variables 30582 1726855316.89330: in VariableManager get_vars() 30582 1726855316.89364: Calling all_inventory to load vars for managed_node3 30582 1726855316.89366: Calling groups_inventory to load vars for managed_node3 30582 1726855316.89368: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855316.89379: Calling all_plugins_play to load vars for managed_node3 30582 1726855316.89382: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855316.89384: Calling groups_plugins_play to load vars for managed_node3 30582 1726855316.95770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855316.96955: done with get_vars() 30582 1726855316.96977: done getting variables 30582 1726855316.97014: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:01:56 -0400 (0:00:00.228) 0:00:53.320 ****** 30582 1726855316.97035: entering _queue_task() for managed_node3/dnf 30582 1726855316.97308: worker is 1 (out of 1 available) 30582 1726855316.97323: exiting _queue_task() for managed_node3/dnf 30582 1726855316.97334: done queuing things up, now waiting for results queue to drain 30582 1726855316.97337: waiting for pending results... 30582 1726855316.97529: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855316.97638: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010fa 30582 1726855316.97650: variable 'ansible_search_path' from source: unknown 30582 1726855316.97655: variable 'ansible_search_path' from source: unknown 30582 1726855316.97691: calling self._execute() 30582 1726855316.97765: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855316.97770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855316.97783: variable 'omit' from source: magic vars 30582 1726855316.98068: variable 'ansible_distribution_major_version' from source: facts 30582 1726855316.98081: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855316.98225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855317.00654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855317.00749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855317.00822: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855317.00837: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855317.00856: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855317.00927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.00949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.00997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.01048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.01054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.01161: variable 'ansible_distribution' from source: facts 30582 1726855317.01167: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.01179: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855317.01264: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855317.01356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.01374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.01395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.01420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.01431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.01460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.01481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.01502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.01525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.01535: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.01562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.01582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.01602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.01629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.01640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.01748: variable 'network_connections' from source: include params 30582 1726855317.01758: variable 'interface' from source: play vars 30582 1726855317.01819: variable 'interface' from source: play vars 30582 1726855317.01869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855317.01994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855317.02022: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855317.02045: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855317.02067: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855317.02101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855317.02119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855317.02141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.02160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855317.02210: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855317.02437: variable 'network_connections' from source: include params 30582 1726855317.02443: variable 'interface' from source: play vars 30582 1726855317.02528: variable 'interface' from source: play vars 30582 1726855317.02546: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855317.02583: when evaluation is False, skipping this task 30582 1726855317.02586: _execute() done 30582 1726855317.02590: dumping result to json 30582 1726855317.02592: done dumping result, returning 30582 1726855317.02595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000010fa] 30582 
1726855317.02597: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fa 30582 1726855317.02705: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fa 30582 1726855317.02709: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855317.02759: no more pending results, returning what we have 30582 1726855317.02764: results queue empty 30582 1726855317.02765: checking for any_errors_fatal 30582 1726855317.02774: done checking for any_errors_fatal 30582 1726855317.02775: checking for max_fail_percentage 30582 1726855317.02777: done checking for max_fail_percentage 30582 1726855317.02778: checking to see if all hosts have failed and the running result is not ok 30582 1726855317.02778: done checking to see if all hosts have failed 30582 1726855317.02779: getting the remaining hosts for this loop 30582 1726855317.02780: done getting the remaining hosts for this loop 30582 1726855317.02784: getting the next task for host managed_node3 30582 1726855317.02794: done getting next task for host managed_node3 30582 1726855317.02798: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855317.02803: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855317.02823: getting variables 30582 1726855317.02825: in VariableManager get_vars() 30582 1726855317.02862: Calling all_inventory to load vars for managed_node3 30582 1726855317.02865: Calling groups_inventory to load vars for managed_node3 30582 1726855317.02867: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855317.02877: Calling all_plugins_play to load vars for managed_node3 30582 1726855317.02880: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855317.02882: Calling groups_plugins_play to load vars for managed_node3 30582 1726855317.03774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855317.04986: done with get_vars() 30582 1726855317.05007: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855317.05070: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:01:57 -0400 (0:00:00.080) 0:00:53.401 ****** 30582 1726855317.05111: entering _queue_task() for managed_node3/yum 30582 1726855317.05445: worker is 1 (out of 1 available) 30582 1726855317.05458: exiting _queue_task() for managed_node3/yum 30582 1726855317.05472: done queuing things up, now waiting for results queue to drain 30582 1726855317.05474: waiting for pending results... 30582 1726855317.05738: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855317.05920: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010fb 30582 1726855317.05931: variable 'ansible_search_path' from source: unknown 30582 1726855317.05935: variable 'ansible_search_path' from source: unknown 30582 1726855317.06019: calling self._execute() 30582 1726855317.06104: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.06108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.06112: variable 'omit' from source: magic vars 30582 1726855317.06494: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.06505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855317.06672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855317.08280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855317.08338: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855317.08432: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855317.08478: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855317.08504: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855317.08594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.08615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.08632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.08657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.08668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.08744: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.08757: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855317.08761: when evaluation is False, skipping this task 30582 1726855317.08764: _execute() done 30582 1726855317.08767: dumping result to json 30582 1726855317.08769: done dumping result, returning 30582 1726855317.08780: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000010fb]
30582 1726855317.08783: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fb
30582 1726855317.08880: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fb
30582 1726855317.08883: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30582 1726855317.08953: no more pending results, returning what we have
30582 1726855317.08957: results queue empty
30582 1726855317.08958: checking for any_errors_fatal
30582 1726855317.08966: done checking for any_errors_fatal
30582 1726855317.08966: checking for max_fail_percentage
30582 1726855317.08968: done checking for max_fail_percentage
30582 1726855317.08969: checking to see if all hosts have failed and the running result is not ok
30582 1726855317.08970: done checking to see if all hosts have failed
30582 1726855317.08971: getting the remaining hosts for this loop
30582 1726855317.08972: done getting the remaining hosts for this loop
30582 1726855317.08976: getting the next task for host managed_node3
30582 1726855317.08984: done getting next task for host managed_node3
30582 1726855317.08990: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30582 1726855317.08996: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855317.09018: getting variables
30582 1726855317.09020: in VariableManager get_vars()
30582 1726855317.09058: Calling all_inventory to load vars for managed_node3
30582 1726855317.09062: Calling groups_inventory to load vars for managed_node3
30582 1726855317.09064: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855317.09074: Calling all_plugins_play to load vars for managed_node3
30582 1726855317.09077: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855317.09079: Calling groups_plugins_play to load vars for managed_node3
30582 1726855317.10122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855317.11690: done with get_vars()
30582 1726855317.11715: done getting variables
30582 1726855317.11778: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 14:01:57 -0400 (0:00:00.067) 0:00:53.468 ******
30582 1726855317.11824: entering _queue_task() for managed_node3/fail
30582 1726855317.12096: worker is 1 (out of 1 available)
30582 1726855317.12112: exiting _queue_task() for managed_node3/fail
30582 1726855317.12123: done queuing things up, now waiting for results queue to drain
30582 1726855317.12125: waiting for pending results...
30582 1726855317.12320: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30582 1726855317.12424: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010fc
30582 1726855317.12435: variable 'ansible_search_path' from source: unknown
30582 1726855317.12438: variable 'ansible_search_path' from source: unknown
30582 1726855317.12472: calling self._execute()
30582 1726855317.12544: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855317.12548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855317.12555: variable 'omit' from source: magic vars
30582 1726855317.12838: variable 'ansible_distribution_major_version' from source: facts
30582 1726855317.12847: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855317.12936: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855317.13065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855317.15067: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855317.15523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855317.15569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855317.15651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855317.15694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855317.15807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.15853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.15901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.15958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.15979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.16050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.16071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.16116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.16270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.16273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.16276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.16278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.16325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.16375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.16408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.16722: variable 'network_connections' from source: include params
30582 1726855317.16725: variable 'interface' from source: play vars
30582 1726855317.16773: variable 'interface' from source: play vars
30582 1726855317.16865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855317.17047: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855317.17096: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855317.17135: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855317.17178: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855317.17249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855317.17354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855317.17358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.17360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855317.17451: variable '__network_team_connections_defined' from source: role '' defaults
30582 1726855317.17720: variable 'network_connections' from source: include params
30582 1726855317.17724: variable 'interface' from source: play vars
30582 1726855317.17769: variable 'interface' from source: play vars
30582 1726855317.17798: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30582 1726855317.17802: when evaluation is False, skipping this task
30582 1726855317.17806: _execute() done
30582 1726855317.17814: dumping result to json
30582 1726855317.17818: done dumping result, returning
30582 1726855317.17821: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000010fc]
30582 1726855317.17823: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fc
30582 1726855317.17918: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fc
30582 1726855317.17921: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30582 1726855317.17995: no more pending results, returning what we have
30582 1726855317.17998: results queue empty
30582 1726855317.17999: checking for any_errors_fatal
30582 1726855317.18006: done checking for any_errors_fatal
30582 1726855317.18007: checking for max_fail_percentage
30582 1726855317.18009: done checking for max_fail_percentage
30582 1726855317.18010: checking to see if all hosts have failed and the running result is not ok
30582 1726855317.18010: done checking to see if all hosts have failed
30582 1726855317.18011: getting the remaining hosts for this loop
30582 1726855317.18012: done getting the remaining hosts for this loop
30582 1726855317.18016: getting the next task for host managed_node3
30582 1726855317.18024: done getting next task for host managed_node3
30582 1726855317.18028: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30582 1726855317.18032: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855317.18051: getting variables
30582 1726855317.18052: in VariableManager get_vars()
30582 1726855317.18094: Calling all_inventory to load vars for managed_node3
30582 1726855317.18097: Calling groups_inventory to load vars for managed_node3
30582 1726855317.18099: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855317.18108: Calling all_plugins_play to load vars for managed_node3
30582 1726855317.18110: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855317.18113: Calling groups_plugins_play to load vars for managed_node3
30582 1726855317.19110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855317.20504: done with get_vars()
30582 1726855317.20527: done getting variables
30582 1726855317.20583: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 14:01:57 -0400 (0:00:00.087) 0:00:53.556 ******
30582 1726855317.20614: entering _queue_task() for managed_node3/package
30582 1726855317.20880: worker is 1 (out of 1 available)
30582 1726855317.20897: exiting _queue_task() for managed_node3/package
30582 1726855317.20910: done queuing things up, now waiting for results queue to drain
30582 1726855317.20912: waiting for pending results...
30582 1726855317.21096: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
30582 1726855317.21202: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010fd
30582 1726855317.21213: variable 'ansible_search_path' from source: unknown
30582 1726855317.21217: variable 'ansible_search_path' from source: unknown
30582 1726855317.21251: calling self._execute()
30582 1726855317.21318: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855317.21321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855317.21328: variable 'omit' from source: magic vars
30582 1726855317.21605: variable 'ansible_distribution_major_version' from source: facts
30582 1726855317.21613: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855317.21746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855317.21949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855317.21984: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855317.22015: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855317.22065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855317.22154: variable 'network_packages' from source: role '' defaults
30582 1726855317.22231: variable '__network_provider_setup' from source: role '' defaults
30582 1726855317.22236: variable '__network_service_name_default_nm' from source: role '' defaults
30582 1726855317.22285: variable '__network_service_name_default_nm' from source: role '' defaults
30582 1726855317.22302: variable '__network_packages_default_nm' from source: role '' defaults
30582 1726855317.22349: variable '__network_packages_default_nm' from source: role '' defaults
30582 1726855317.22691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855317.24441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855317.24492: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855317.24521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855317.24545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855317.24568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855317.24640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.24661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.24685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.24712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.24723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.24755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.24771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.24795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.24819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.24829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.24985: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30582 1726855317.25063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.25083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.25105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.25130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.25140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.25206: variable 'ansible_python' from source: facts
30582 1726855317.25222: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30582 1726855317.25290: variable '__network_wpa_supplicant_required' from source: role '' defaults
30582 1726855317.25364: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30582 1726855317.25497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.25516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.25533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.25583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.25593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.25892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855317.25905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855317.25908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.25910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855317.25913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855317.25915: variable 'network_connections' from source: include params
30582 1726855317.25917: variable 'interface' from source: play vars
30582 1726855317.26004: variable 'interface' from source: play vars
30582 1726855317.26080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855317.26107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855317.26150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855317.26185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855317.26229: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855317.26474: variable 'network_connections' from source: include params
30582 1726855317.26478: variable 'interface' from source: play vars
30582 1726855317.26564: variable 'interface' from source: play vars
30582 1726855317.26629: variable '__network_packages_default_wireless' from source: role '' defaults
30582 1726855317.26704: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855317.26979: variable 'network_connections' from source: include params
30582 1726855317.26982: variable 'interface' from source: play vars
30582 1726855317.27043: variable 'interface' from source: play vars
30582 1726855317.27131: variable '__network_packages_default_team' from source: role '' defaults
30582 1726855317.27140: variable '__network_team_connections_defined' from source: role '' defaults
30582 1726855317.27427: variable 'network_connections' from source: include params
30582 1726855317.27430: variable 'interface' from source: play vars
30582 1726855317.27494: variable 'interface' from source: play vars
30582 1726855317.27554: variable '__network_service_name_default_initscripts' from source: role '' defaults
30582 1726855317.27610: variable '__network_service_name_default_initscripts' from source: role '' defaults
30582 1726855317.27617: variable '__network_packages_default_initscripts' from source: role '' defaults
30582 1726855317.27678: variable '__network_packages_default_initscripts' from source: role '' defaults
30582 1726855317.27837: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
30582 1726855317.28143: variable 'network_connections' from source: include params
30582 1726855317.28147: variable 'interface' from source: play vars
30582 1726855317.28190: variable 'interface' from source: play vars
30582 1726855317.28198: variable 'ansible_distribution' from source: facts
30582 1726855317.28201: variable '__network_rh_distros' from source: role '' defaults
30582 1726855317.28210: variable 'ansible_distribution_major_version' from source: facts
30582 1726855317.28234: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
30582 1726855317.28341: variable 'ansible_distribution' from source: facts
30582 1726855317.28344: variable '__network_rh_distros' from source: role '' defaults
30582 1726855317.28349: variable 'ansible_distribution_major_version' from source: facts
30582 1726855317.28358: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
30582 1726855317.28463: variable 'ansible_distribution' from source: facts
30582 1726855317.28467: variable '__network_rh_distros' from source: role '' defaults
30582 1726855317.28470: variable 'ansible_distribution_major_version' from source: facts
30582 1726855317.28498: variable 'network_provider' from source: set_fact
30582 1726855317.28510: variable 'ansible_facts' from source: unknown
30582 1726855317.29189: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
30582 1726855317.29193: when evaluation is False, skipping this task
30582 1726855317.29195: _execute() done
30582 1726855317.29197: dumping result to json
30582 1726855317.29206: done dumping result, returning
30582 1726855317.29209: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-0000000010fd]
30582 1726855317.29212: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fd
30582 1726855317.29275: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fd
30582 1726855317.29277: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
30582 1726855317.29341: no more pending results, returning what we have
30582 1726855317.29345: results queue empty
30582 1726855317.29346: checking for any_errors_fatal
30582 1726855317.29354: done checking for any_errors_fatal
30582 1726855317.29354: checking for max_fail_percentage
30582 1726855317.29356: done checking for max_fail_percentage
30582 1726855317.29357: checking to see if all hosts have failed and the running result is not ok
30582 1726855317.29358: done checking to see if all hosts have failed
30582 1726855317.29359: getting the remaining hosts for this loop
30582 1726855317.29360: done getting the remaining hosts for this loop
30582 1726855317.29364: getting the next task for host managed_node3
30582 1726855317.29372: done getting next task for host managed_node3
30582 1726855317.29375: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30582 1726855317.29380: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855317.29401: getting variables
30582 1726855317.29403: in VariableManager get_vars()
30582 1726855317.29618: Calling all_inventory to load vars for managed_node3
30582 1726855317.29621: Calling groups_inventory to load vars for managed_node3
30582 1726855317.29624: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855317.29633: Calling all_plugins_play to load vars for managed_node3
30582 1726855317.29636: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855317.29639: Calling groups_plugins_play to load vars for managed_node3
30582 1726855317.31019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855317.31918: done with get_vars()
30582 1726855317.31937: done getting variables
30582 1726855317.31985: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 14:01:57 -0400 (0:00:00.113) 0:00:53.670 ******
30582 1726855317.32013: entering _queue_task() for managed_node3/package
30582 1726855317.32320: worker is 1 (out of 1 available)
30582 1726855317.32331: exiting _queue_task() for managed_node3/package
30582 1726855317.32344: done queuing things up, now waiting for results queue to drain
30582 1726855317.32346: waiting for pending results...
30582 1726855317.32717: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855317.32993: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010fe 30582 1726855317.32998: variable 'ansible_search_path' from source: unknown 30582 1726855317.33001: variable 'ansible_search_path' from source: unknown 30582 1726855317.33004: calling self._execute() 30582 1726855317.33006: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.33008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.33010: variable 'omit' from source: magic vars 30582 1726855317.33585: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.33621: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855317.33815: variable 'network_state' from source: role '' defaults 30582 1726855317.33865: Evaluated conditional (network_state != {}): False 30582 1726855317.33894: when evaluation is False, skipping this task 30582 1726855317.33912: _execute() done 30582 1726855317.33919: dumping result to json 30582 1726855317.33958: done dumping result, returning 30582 1726855317.33978: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000010fe] 30582 1726855317.33983: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fe 30582 1726855317.34261: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010fe 30582 1726855317.34264: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855317.34331: no more pending results, returning what we have 30582 1726855317.34336: results queue empty 30582 1726855317.34337: checking 
for any_errors_fatal 30582 1726855317.34342: done checking for any_errors_fatal 30582 1726855317.34343: checking for max_fail_percentage 30582 1726855317.34345: done checking for max_fail_percentage 30582 1726855317.34346: checking to see if all hosts have failed and the running result is not ok 30582 1726855317.34347: done checking to see if all hosts have failed 30582 1726855317.34347: getting the remaining hosts for this loop 30582 1726855317.34349: done getting the remaining hosts for this loop 30582 1726855317.34358: getting the next task for host managed_node3 30582 1726855317.34368: done getting next task for host managed_node3 30582 1726855317.34375: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855317.34381: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855317.34416: getting variables 30582 1726855317.34418: in VariableManager get_vars() 30582 1726855317.34474: Calling all_inventory to load vars for managed_node3 30582 1726855317.34477: Calling groups_inventory to load vars for managed_node3 30582 1726855317.34480: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855317.34682: Calling all_plugins_play to load vars for managed_node3 30582 1726855317.34688: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855317.34693: Calling groups_plugins_play to load vars for managed_node3 30582 1726855317.37116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855317.38883: done with get_vars() 30582 1726855317.38912: done getting variables 30582 1726855317.38976: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:01:57 -0400 (0:00:00.069) 0:00:53.740 ****** 30582 1726855317.39014: entering _queue_task() for managed_node3/package 30582 1726855317.39392: worker is 1 (out of 1 available) 30582 1726855317.39404: exiting _queue_task() for managed_node3/package 30582 1726855317.39419: done queuing things up, now waiting for results queue to drain 30582 1726855317.39420: waiting for pending results... 
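Editor's note: the skip above is driven by the role's guard on `network_state`. A minimal sketch of what that task in `roles/network/tasks/main.yml` plausibly looks like — only the task name, the `package` action, and the `network_state != {}` condition are confirmed by this log; the package list and argument layout are assumptions:

```yaml
# Hypothetical reconstruction, not copied from the role source.
# Confirmed by the log: task name, action plugin 'package',
# and the when-condition that evaluated to False here.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed package names
      - nmstate
    state: present
  when: network_state != {}
```

Because `network_state` resolved to the role default (`{}`), the conditional evaluated False and the task was skipped on managed_node3, exactly as the `skipping:` result shows.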
30582 1726855317.39765: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855317.39907: in run() - task 0affcc66-ac2b-aa83-7d57-0000000010ff 30582 1726855317.39934: variable 'ansible_search_path' from source: unknown 30582 1726855317.39939: variable 'ansible_search_path' from source: unknown 30582 1726855317.39976: calling self._execute() 30582 1726855317.40063: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.40067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.40078: variable 'omit' from source: magic vars 30582 1726855317.40364: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.40373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855317.40460: variable 'network_state' from source: role '' defaults 30582 1726855317.40469: Evaluated conditional (network_state != {}): False 30582 1726855317.40472: when evaluation is False, skipping this task 30582 1726855317.40477: _execute() done 30582 1726855317.40480: dumping result to json 30582 1726855317.40484: done dumping result, returning 30582 1726855317.40493: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000010ff] 30582 1726855317.40498: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010ff 30582 1726855317.40586: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000010ff 30582 1726855317.40591: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855317.40656: no more pending results, returning what we have 30582 1726855317.40660: results queue empty 30582 1726855317.40662: checking for 
any_errors_fatal 30582 1726855317.40668: done checking for any_errors_fatal 30582 1726855317.40668: checking for max_fail_percentage 30582 1726855317.40670: done checking for max_fail_percentage 30582 1726855317.40671: checking to see if all hosts have failed and the running result is not ok 30582 1726855317.40672: done checking to see if all hosts have failed 30582 1726855317.40673: getting the remaining hosts for this loop 30582 1726855317.40674: done getting the remaining hosts for this loop 30582 1726855317.40678: getting the next task for host managed_node3 30582 1726855317.40685: done getting next task for host managed_node3 30582 1726855317.40691: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855317.40696: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855317.40717: getting variables 30582 1726855317.40719: in VariableManager get_vars() 30582 1726855317.40751: Calling all_inventory to load vars for managed_node3 30582 1726855317.40755: Calling groups_inventory to load vars for managed_node3 30582 1726855317.40758: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855317.40772: Calling all_plugins_play to load vars for managed_node3 30582 1726855317.40775: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855317.40780: Calling groups_plugins_play to load vars for managed_node3 30582 1726855317.42336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855317.43937: done with get_vars() 30582 1726855317.43956: done getting variables 30582 1726855317.44016: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:01:57 -0400 (0:00:00.050) 0:00:53.790 ****** 30582 1726855317.44043: entering _queue_task() for managed_node3/service 30582 1726855317.44309: worker is 1 (out of 1 available) 30582 1726855317.44322: exiting _queue_task() for managed_node3/service 30582 1726855317.44335: done queuing things up, now waiting for results queue to drain 30582 1726855317.44337: waiting for pending results... 
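Editor's note: the task at `tasks/main.yml:96` skipped above follows the same pattern. A sketch under the same caveat — the task name, the `package` action, and the failing condition appear in the log; the exact arguments are assumptions:

```yaml
# Hypothetical reconstruction -- the log confirms the task name, the
# 'package' action plugin, and false_condition "network_state != {}".
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate   # assumed single-package form
    state: present
  when: network_state != {}
```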
30582 1726855317.44612: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855317.44723: in run() - task 0affcc66-ac2b-aa83-7d57-000000001100 30582 1726855317.44727: variable 'ansible_search_path' from source: unknown 30582 1726855317.44731: variable 'ansible_search_path' from source: unknown 30582 1726855317.44780: calling self._execute() 30582 1726855317.44878: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.44883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.44894: variable 'omit' from source: magic vars 30582 1726855317.45209: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.45218: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855317.45308: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855317.45445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855317.48010: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855317.48077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855317.48139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855317.48160: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855317.48191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855317.48264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855317.48313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.48333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.48374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.48394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.48460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.48474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.48503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.48526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.48536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.48564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.48586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.48616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.48637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.48647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.48781: variable 'network_connections' from source: include params 30582 1726855317.48798: variable 'interface' from source: play vars 30582 1726855317.48867: variable 'interface' from source: play vars 30582 1726855317.48970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855317.49292: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855317.49296: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855317.49298: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855317.49301: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855317.49303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855317.49305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855317.49315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.49336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855317.49402: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855317.49676: variable 'network_connections' from source: include params 30582 1726855317.49684: variable 'interface' from source: play vars 30582 1726855317.49768: variable 'interface' from source: play vars 30582 1726855317.49780: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855317.49784: when evaluation is False, skipping this task 30582 1726855317.49786: _execute() done 30582 1726855317.49791: dumping result to json 30582 1726855317.49794: done dumping result, returning 30582 1726855317.49801: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001100] 30582 1726855317.49806: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001100 30582 1726855317.50034: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000001100 30582 1726855317.50044: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855317.50127: no more pending results, returning what we have 30582 1726855317.50131: results queue empty 30582 1726855317.50132: checking for any_errors_fatal 30582 1726855317.50138: done checking for any_errors_fatal 30582 1726855317.50139: checking for max_fail_percentage 30582 1726855317.50141: done checking for max_fail_percentage 30582 1726855317.50142: checking to see if all hosts have failed and the running result is not ok 30582 1726855317.50142: done checking to see if all hosts have failed 30582 1726855317.50143: getting the remaining hosts for this loop 30582 1726855317.50144: done getting the remaining hosts for this loop 30582 1726855317.50148: getting the next task for host managed_node3 30582 1726855317.50156: done getting next task for host managed_node3 30582 1726855317.50160: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855317.50165: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855317.50188: getting variables 30582 1726855317.50190: in VariableManager get_vars() 30582 1726855317.50222: Calling all_inventory to load vars for managed_node3 30582 1726855317.50225: Calling groups_inventory to load vars for managed_node3 30582 1726855317.50226: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855317.50235: Calling all_plugins_play to load vars for managed_node3 30582 1726855317.50237: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855317.50239: Calling groups_plugins_play to load vars for managed_node3 30582 1726855317.52231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855317.54132: done with get_vars() 30582 1726855317.54166: done getting variables 30582 1726855317.54234: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:01:57 -0400 (0:00:00.102) 0:00:53.892 ****** 30582 1726855317.54274: entering _queue_task() for managed_node3/service 30582 1726855317.54596: worker is 1 (out of 1 available) 30582 1726855317.54612: exiting _queue_task() for managed_node3/service 30582 1726855317.54625: done 
queuing things up, now waiting for results queue to drain 30582 1726855317.54627: waiting for pending results... 30582 1726855317.54814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855317.54910: in run() - task 0affcc66-ac2b-aa83-7d57-000000001101 30582 1726855317.54922: variable 'ansible_search_path' from source: unknown 30582 1726855317.54925: variable 'ansible_search_path' from source: unknown 30582 1726855317.54953: calling self._execute() 30582 1726855317.55028: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.55032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.55040: variable 'omit' from source: magic vars 30582 1726855317.55378: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.55389: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855317.55520: variable 'network_provider' from source: set_fact 30582 1726855317.55524: variable 'network_state' from source: role '' defaults 30582 1726855317.55534: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855317.55540: variable 'omit' from source: magic vars 30582 1726855317.55584: variable 'omit' from source: magic vars 30582 1726855317.55606: variable 'network_service_name' from source: role '' defaults 30582 1726855317.55893: variable 'network_service_name' from source: role '' defaults 30582 1726855317.55896: variable '__network_provider_setup' from source: role '' defaults 30582 1726855317.55899: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855317.55902: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855317.55904: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855317.55907: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855317.56116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855317.58019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855317.58070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855317.58101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855317.58128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855317.58147: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855317.58210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.58232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.58249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.58277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.58289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.58325: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.58341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.58357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.58383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.58395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.58548: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855317.58626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.58646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.58663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.58690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.58705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.58768: variable 'ansible_python' from source: facts 30582 1726855317.58782: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855317.58839: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855317.58898: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855317.58979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.58999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.59015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.59039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.59049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.59088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855317.59108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855317.59124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.59147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855317.59157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855317.59251: variable 'network_connections' from source: include params 30582 1726855317.59257: variable 'interface' from source: play vars 30582 1726855317.59313: variable 'interface' from source: play vars 30582 1726855317.59444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855317.60037: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855317.60097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855317.60193: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855317.60203: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855317.60271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855317.60325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855317.60348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855317.60379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855317.60419: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855317.60612: variable 'network_connections' from source: include params 30582 1726855317.60617: variable 'interface' from source: play vars 30582 1726855317.60669: variable 'interface' from source: play vars 30582 1726855317.60711: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855317.60763: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855317.60952: variable 'network_connections' from source: include params 30582 1726855317.60956: variable 'interface' from source: play vars 30582 1726855317.61009: variable 'interface' from source: play vars 30582 1726855317.61029: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855317.61083: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855317.61266: variable 'network_connections' from source: include params 30582 1726855317.61269: variable 'interface' from source: play vars 30582 1726855317.61321: variable 'interface' from source: play vars 30582 1726855317.61366: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855317.61410: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855317.61416: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855317.61460: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855317.61597: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855317.61902: variable 'network_connections' from source: include params 30582 1726855317.61906: variable 'interface' from source: play vars 30582 1726855317.61948: variable 'interface' from source: play vars 30582 1726855317.61955: variable 'ansible_distribution' from source: facts 30582 1726855317.61958: variable '__network_rh_distros' from source: role '' defaults 30582 1726855317.61964: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.61989: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855317.62103: variable 'ansible_distribution' from source: facts 30582 1726855317.62107: variable '__network_rh_distros' from source: role '' defaults 30582 1726855317.62110: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.62119: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855317.62226: variable 'ansible_distribution' from source: facts 30582 1726855317.62230: variable '__network_rh_distros' from source: role '' defaults 30582 1726855317.62237: variable 'ansible_distribution_major_version' from source: facts 30582 1726855317.62261: variable 'network_provider' from source: set_fact 30582 1726855317.62279: variable 'omit' from source: magic vars 30582 1726855317.62301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855317.62322: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855317.62337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855317.62352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855317.62360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855317.62383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855317.62386: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.62390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855317.62493: Set connection var ansible_timeout to 10 30582 1726855317.62496: Set connection var ansible_connection to ssh 30582 1726855317.62499: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855317.62501: Set connection var ansible_pipelining to False 30582 1726855317.62503: Set connection var ansible_shell_executable to /bin/sh 30582 1726855317.62505: Set connection var ansible_shell_type to sh 30582 1726855317.62534: variable 'ansible_shell_executable' from source: unknown 30582 1726855317.62537: variable 'ansible_connection' from source: unknown 30582 1726855317.62539: variable 'ansible_module_compression' from source: unknown 30582 1726855317.62557: variable 'ansible_shell_type' from source: unknown 30582 1726855317.62574: variable 'ansible_shell_executable' from source: unknown 30582 1726855317.62577: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855317.62579: variable 'ansible_pipelining' from source: unknown 30582 1726855317.62581: variable 'ansible_timeout' from source: unknown 30582 1726855317.62626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855317.62682: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855317.62692: variable 'omit' from source: magic vars 30582 1726855317.62715: starting attempt loop 30582 1726855317.62718: running the handler 30582 1726855317.62792: variable 'ansible_facts' from source: unknown 30582 1726855317.63474: _low_level_execute_command(): starting 30582 1726855317.63483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855317.64216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855317.64220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.64228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855317.64232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.64341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 
30582 1726855317.64355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855317.64432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855317.66103: stdout chunk (state=3): >>>/root <<< 30582 1726855317.66274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855317.66277: stdout chunk (state=3): >>><<< 30582 1726855317.66278: stderr chunk (state=3): >>><<< 30582 1726855317.66350: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855317.66373: _low_level_execute_command(): starting 30582 1726855317.66377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139 `" && echo ansible-tmp-1726855317.6631155-33153-13036679342139="` echo /root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139 `" ) && sleep 0' 30582 1726855317.66908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855317.66911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.66914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855317.66916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.66964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855317.66968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855317.66985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855317.67056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855317.68971: stdout chunk (state=3): >>>ansible-tmp-1726855317.6631155-33153-13036679342139=/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139 <<< 30582 1726855317.69104: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855317.69109: stdout chunk (state=3): >>><<< 30582 1726855317.69112: stderr chunk (state=3): >>><<< 30582 1726855317.69129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855317.6631155-33153-13036679342139=/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855317.69156: variable 'ansible_module_compression' from source: unknown 30582 1726855317.69202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855317.69250: variable 'ansible_facts' from source: unknown 30582 1726855317.69388: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py 30582 1726855317.69494: Sending initial data 30582 1726855317.69497: Sent initial data (155 bytes) 30582 1726855317.69967: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855317.69973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855317.69983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855317.69985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855317.69989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.70032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855317.70035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855317.70042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855317.70129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855317.71668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855317.71721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855317.71805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpu_43ljfs /root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py <<< 30582 1726855317.71808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py" <<< 30582 1726855317.71853: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpu_43ljfs" to remote "/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py" <<< 30582 1726855317.73206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855317.73251: stderr chunk (state=3): >>><<< 30582 1726855317.73255: stdout chunk (state=3): >>><<< 30582 1726855317.73299: done transferring module to remote 30582 1726855317.73308: _low_level_execute_command(): starting 30582 1726855317.73313: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/ /root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py && sleep 0' 30582 1726855317.74083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855317.74093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855317.74097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855317.74238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855317.75998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855317.76048: stderr chunk (state=3): >>><<< 30582 1726855317.76052: stdout chunk (state=3): >>><<< 30582 1726855317.76065: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855317.76067: _low_level_execute_command(): starting 30582 1726855317.76074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/AnsiballZ_systemd.py && sleep 0' 30582 1726855317.76577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855317.76581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.76585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855317.76629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855317.76633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855317.76698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.05879: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10612736", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324506112", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2120012000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855318.05894: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855318.07820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855318.07851: stderr chunk (state=3): >>><<< 30582 1726855318.07854: stdout chunk (state=3): >>><<< 30582 1726855318.07872: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10612736", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324506112", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2120012000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855318.08003: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855318.08019: _low_level_execute_command(): starting 30582 1726855318.08022: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855317.6631155-33153-13036679342139/ > /dev/null 2>&1 && sleep 0' 30582 1726855318.08477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855318.08480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855318.08515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855318.08518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855318.08520: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855318.08522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.08592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.08595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855318.08600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.08655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.10525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.10551: stderr chunk (state=3): >>><<< 30582 1726855318.10560: stdout chunk (state=3): >>><<< 30582 1726855318.10563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855318.10606: handler run complete 30582 1726855318.10647: attempt loop complete, returning result 30582 1726855318.10651: _execute() done 30582 1726855318.10664: dumping result to json 30582 1726855318.10680: done dumping result, returning 30582 1726855318.10704: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000001101] 30582 1726855318.10707: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001101 30582 1726855318.11018: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001101 30582 1726855318.11021: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855318.11078: no more pending results, returning what we have 30582 1726855318.11081: results queue empty 30582 1726855318.11082: checking for any_errors_fatal 30582 1726855318.11089: done checking for any_errors_fatal 30582 1726855318.11090: checking for max_fail_percentage 30582 1726855318.11092: done checking for max_fail_percentage 30582 1726855318.11093: checking to see if all hosts have failed and the running result is not ok 30582 1726855318.11093: done checking to see if all hosts have failed 30582 1726855318.11094: getting the remaining hosts for this loop 30582 1726855318.11095: done getting the remaining hosts for this loop 30582 1726855318.11099: getting the next task for host managed_node3 30582 1726855318.11106: done getting next task for host managed_node3 30582 1726855318.11109: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855318.11115: ^ state is: HOST STATE: block=6, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855318.11127: getting variables 30582 1726855318.11128: in VariableManager get_vars() 30582 1726855318.11157: Calling all_inventory to load vars for managed_node3 30582 1726855318.11160: Calling groups_inventory to load vars for managed_node3 30582 1726855318.11161: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855318.11170: Calling all_plugins_play to load vars for managed_node3 30582 1726855318.11173: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855318.11175: Calling groups_plugins_play to load vars for managed_node3 30582 1726855318.12505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855318.14686: done with get_vars() 30582 1726855318.14724: done getting variables 30582 1726855318.14996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:01:58 -0400 (0:00:00.607) 0:00:54.500 ****** 30582 1726855318.15045: entering _queue_task() for managed_node3/service 30582 1726855318.15699: worker is 1 (out of 1 available) 30582 1726855318.15714: exiting _queue_task() for managed_node3/service 30582 1726855318.15731: done queuing things up, now waiting for results queue to drain 30582 1726855318.15737: waiting for pending results... 
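The `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"` line in the task result above is Ansible's standard redaction: with `no_log` set, the full return payload (visible here only because of `-vvv` low-level stdout chunks) is replaced by that placeholder in the reported result. A minimal, hypothetical illustration of the same behavior:

```yaml
# Hypothetical sketch of the censoring seen above: any task with no_log: true
# has its registered/reported result replaced by the "censored" placeholder,
# even though the module itself still returns its normal data.
- name: Task whose result is hidden from output
  ansible.builtin.command: echo not-shown-in-logs
  register: hidden_result
  no_log: true
```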
30582 1726855318.16052: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855318.16148: in run() - task 0affcc66-ac2b-aa83-7d57-000000001102 30582 1726855318.16161: variable 'ansible_search_path' from source: unknown 30582 1726855318.16173: variable 'ansible_search_path' from source: unknown 30582 1726855318.16201: calling self._execute() 30582 1726855318.16286: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.16292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.16298: variable 'omit' from source: magic vars 30582 1726855318.16590: variable 'ansible_distribution_major_version' from source: facts 30582 1726855318.16599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855318.16681: variable 'network_provider' from source: set_fact 30582 1726855318.16686: Evaluated conditional (network_provider == "nm"): True 30582 1726855318.16779: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855318.16893: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855318.17019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855318.19262: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855318.19333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855318.19374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855318.19413: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855318.19445: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855318.19553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855318.19693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855318.19697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855318.19700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855318.19703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855318.19742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855318.19774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855318.19807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855318.19850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855318.19874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855318.19921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855318.19949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855318.19983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855318.20032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855318.20104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855318.20283: variable 'network_connections' from source: include params 30582 1726855318.20306: variable 'interface' from source: play vars 30582 1726855318.20388: variable 'interface' from source: play vars 30582 1726855318.20592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855318.20644: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855318.20693: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855318.20735: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855318.20768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855318.21094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855318.21097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855318.21099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855318.21101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855318.21103: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855318.21374: variable 'network_connections' from source: include params 30582 1726855318.21386: variable 'interface' from source: play vars 30582 1726855318.21451: variable 'interface' from source: play vars 30582 1726855318.21507: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855318.21515: when evaluation is False, skipping this task 30582 1726855318.21521: _execute() done 30582 1726855318.21527: dumping result to json 30582 1726855318.21533: done dumping result, returning 30582 1726855318.21544: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000001102] 30582 
1726855318.21562: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001102 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855318.21730: no more pending results, returning what we have 30582 1726855318.21734: results queue empty 30582 1726855318.21735: checking for any_errors_fatal 30582 1726855318.21763: done checking for any_errors_fatal 30582 1726855318.21764: checking for max_fail_percentage 30582 1726855318.21766: done checking for max_fail_percentage 30582 1726855318.21767: checking to see if all hosts have failed and the running result is not ok 30582 1726855318.21768: done checking to see if all hosts have failed 30582 1726855318.21769: getting the remaining hosts for this loop 30582 1726855318.21773: done getting the remaining hosts for this loop 30582 1726855318.21777: getting the next task for host managed_node3 30582 1726855318.21789: done getting next task for host managed_node3 30582 1726855318.21793: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855318.21799: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855318.21821: getting variables 30582 1726855318.21823: in VariableManager get_vars() 30582 1726855318.21865: Calling all_inventory to load vars for managed_node3 30582 1726855318.21868: Calling groups_inventory to load vars for managed_node3 30582 1726855318.21874: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855318.21885: Calling all_plugins_play to load vars for managed_node3 30582 1726855318.21994: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855318.22097: Calling groups_plugins_play to load vars for managed_node3 30582 1726855318.22796: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001102 30582 1726855318.22800: WORKER PROCESS EXITING 30582 1726855318.23650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855318.25419: done with get_vars() 30582 1726855318.25454: done getting variables 30582 1726855318.25531: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:01:58 -0400 (0:00:00.105) 0:00:54.605 ****** 30582 1726855318.25568: entering _queue_task() for managed_node3/service 30582 1726855318.26223: worker is 1 (out of 1 available) 30582 
1726855318.26236: exiting _queue_task() for managed_node3/service 30582 1726855318.26249: done queuing things up, now waiting for results queue to drain 30582 1726855318.26251: waiting for pending results... 30582 1726855318.26533: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855318.26938: in run() - task 0affcc66-ac2b-aa83-7d57-000000001103 30582 1726855318.26978: variable 'ansible_search_path' from source: unknown 30582 1726855318.26994: variable 'ansible_search_path' from source: unknown 30582 1726855318.27046: calling self._execute() 30582 1726855318.27184: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.27248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.27252: variable 'omit' from source: magic vars 30582 1726855318.27622: variable 'ansible_distribution_major_version' from source: facts 30582 1726855318.27640: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855318.27764: variable 'network_provider' from source: set_fact 30582 1726855318.27781: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855318.27794: when evaluation is False, skipping this task 30582 1726855318.27802: _execute() done 30582 1726855318.27810: dumping result to json 30582 1726855318.27817: done dumping result, returning 30582 1726855318.27903: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000001103] 30582 1726855318.27906: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001103 30582 1726855318.27982: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001103 30582 1726855318.27985: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 
1726855318.28053: no more pending results, returning what we have 30582 1726855318.28057: results queue empty 30582 1726855318.28058: checking for any_errors_fatal 30582 1726855318.28068: done checking for any_errors_fatal 30582 1726855318.28069: checking for max_fail_percentage 30582 1726855318.28074: done checking for max_fail_percentage 30582 1726855318.28075: checking to see if all hosts have failed and the running result is not ok 30582 1726855318.28076: done checking to see if all hosts have failed 30582 1726855318.28077: getting the remaining hosts for this loop 30582 1726855318.28079: done getting the remaining hosts for this loop 30582 1726855318.28083: getting the next task for host managed_node3 30582 1726855318.28099: done getting next task for host managed_node3 30582 1726855318.28103: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855318.28109: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855318.28132: getting variables 30582 1726855318.28133: in VariableManager get_vars() 30582 1726855318.28175: Calling all_inventory to load vars for managed_node3 30582 1726855318.28177: Calling groups_inventory to load vars for managed_node3 30582 1726855318.28180: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855318.28194: Calling all_plugins_play to load vars for managed_node3 30582 1726855318.28197: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855318.28200: Calling groups_plugins_play to load vars for managed_node3 30582 1726855318.29408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855318.31033: done with get_vars() 30582 1726855318.31059: done getting variables 30582 1726855318.31110: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:01:58 -0400 (0:00:00.055) 0:00:54.661 ****** 30582 1726855318.31137: entering _queue_task() for managed_node3/copy 30582 1726855318.31406: worker is 1 (out of 1 available) 30582 1726855318.31422: exiting _queue_task() for managed_node3/copy 30582 1726855318.31433: done queuing things up, now waiting for results queue to drain 30582 1726855318.31434: waiting for pending results... 
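The repeated `Evaluated conditional (...)` entries above come from ansible-core templating each task's `when:` expression against the host's variables (real Ansible does this through Jinja2). A minimal toy stand-in, using a restricted `eval` purely to illustrate the True/False outcomes in the log; the variable values are illustrative assumptions, except `network_provider`, which the log shows was set to `nm` via `set_fact`:

```python
def evaluate_conditional(expr: str, task_vars: dict) -> bool:
    """Toy stand-in for Ansible's Jinja2-based `when:` evaluation.

    ansible-core compiles the expression with Jinja2 and renders it against
    the merged task vars; this sketch uses a builtins-stripped eval only to
    mirror the boolean results printed in the log above.
    """
    return bool(eval(expr, {"__builtins__": {}}, dict(task_vars)))


# Mirrors the two log lines: the version guard passes, the provider
# check fails, so the task is skipped. The "40" is an assumed value for
# a Fedora managed node; network_provider == "nm" matches the module
# invocation later in this log.
print(evaluate_conditional("ansible_distribution_major_version != '6'",
                           {"ansible_distribution_major_version": "40"}))  # True
print(evaluate_conditional('network_provider == "initscripts"',
                           {"network_provider": "nm"}))                    # False
```

When the conditional comes back `False`, the executor short-circuits exactly as logged: it never builds a connection, just dumps a `skipping:` result onto the results queue.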
30582 1726855318.31618: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855318.31729: in run() - task 0affcc66-ac2b-aa83-7d57-000000001104 30582 1726855318.31740: variable 'ansible_search_path' from source: unknown 30582 1726855318.31743: variable 'ansible_search_path' from source: unknown 30582 1726855318.31772: calling self._execute() 30582 1726855318.31850: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.31854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.31861: variable 'omit' from source: magic vars 30582 1726855318.32138: variable 'ansible_distribution_major_version' from source: facts 30582 1726855318.32146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855318.32229: variable 'network_provider' from source: set_fact 30582 1726855318.32234: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855318.32236: when evaluation is False, skipping this task 30582 1726855318.32241: _execute() done 30582 1726855318.32243: dumping result to json 30582 1726855318.32245: done dumping result, returning 30582 1726855318.32254: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000001104] 30582 1726855318.32259: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001104 30582 1726855318.32353: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001104 30582 1726855318.32356: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855318.32406: no more pending results, returning what we have 30582 1726855318.32410: results queue empty 30582 1726855318.32411: checking for 
any_errors_fatal 30582 1726855318.32419: done checking for any_errors_fatal 30582 1726855318.32419: checking for max_fail_percentage 30582 1726855318.32422: done checking for max_fail_percentage 30582 1726855318.32422: checking to see if all hosts have failed and the running result is not ok 30582 1726855318.32423: done checking to see if all hosts have failed 30582 1726855318.32424: getting the remaining hosts for this loop 30582 1726855318.32426: done getting the remaining hosts for this loop 30582 1726855318.32430: getting the next task for host managed_node3 30582 1726855318.32438: done getting next task for host managed_node3 30582 1726855318.32441: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855318.32448: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855318.32473: getting variables 30582 1726855318.32474: in VariableManager get_vars() 30582 1726855318.32516: Calling all_inventory to load vars for managed_node3 30582 1726855318.32519: Calling groups_inventory to load vars for managed_node3 30582 1726855318.32521: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855318.32530: Calling all_plugins_play to load vars for managed_node3 30582 1726855318.32532: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855318.32535: Calling groups_plugins_play to load vars for managed_node3 30582 1726855318.33745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855318.34984: done with get_vars() 30582 1726855318.35004: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:01:58 -0400 (0:00:00.039) 0:00:54.700 ****** 30582 1726855318.35066: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855318.35322: worker is 1 (out of 1 available) 30582 1726855318.35337: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855318.35348: done queuing things up, now waiting for results queue to drain 30582 1726855318.35350: waiting for pending results... 
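The deeply nested `HOST STATE` dumps above serialize ansible-core's per-host play iterator; each `tasks child state? (HOST STATE: ...)` parenthesis is one level of block nesting, and advancing from task `...1104` to `...1105` only bumps the innermost `task` counter. A simplified mirror of the fields involved (field names taken from the dump; this is a sketch, not the full `HostState` class from `ansible.executor.play_iterator`):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class HostState:
    """Simplified mirror of the fields printed in the HOST STATE dumps.

    Only the counters visible in the log are modeled; the real class
    carries additional rescue/always child states and handler flags.
    """
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    pending_setup: bool = False
    tasks_child_state: Optional["HostState"] = None


# The state logged for the "Configure networking connection profiles" task:
# three levels of included blocks, with the innermost cursor at task 20.
state = HostState(block=6, task=2, tasks_child_state=HostState(
    block=0, task=6, tasks_child_state=HostState(
        block=0, task=2, tasks_child_state=HostState(block=0, task=20))))
```

Reading the dump this way makes the progression between the two tasks visible: the outer cursors stay fixed while the innermost `task` index moves from 19 to 20.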
30582 1726855318.35537: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855318.35624: in run() - task 0affcc66-ac2b-aa83-7d57-000000001105 30582 1726855318.35636: variable 'ansible_search_path' from source: unknown 30582 1726855318.35640: variable 'ansible_search_path' from source: unknown 30582 1726855318.35666: calling self._execute() 30582 1726855318.35741: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.35745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.35754: variable 'omit' from source: magic vars 30582 1726855318.36104: variable 'ansible_distribution_major_version' from source: facts 30582 1726855318.36294: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855318.36297: variable 'omit' from source: magic vars 30582 1726855318.36299: variable 'omit' from source: magic vars 30582 1726855318.36354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855318.38356: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855318.38411: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855318.38438: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855318.38464: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855318.38489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855318.38551: variable 'network_provider' from source: set_fact 30582 1726855318.38651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855318.38674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855318.38695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855318.38721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855318.38732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855318.38785: variable 'omit' from source: magic vars 30582 1726855318.38867: variable 'omit' from source: magic vars 30582 1726855318.38942: variable 'network_connections' from source: include params 30582 1726855318.38952: variable 'interface' from source: play vars 30582 1726855318.38998: variable 'interface' from source: play vars 30582 1726855318.39109: variable 'omit' from source: magic vars 30582 1726855318.39116: variable '__lsr_ansible_managed' from source: task vars 30582 1726855318.39162: variable '__lsr_ansible_managed' from source: task vars 30582 1726855318.39297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855318.39434: Loaded config def from plugin (lookup/template) 30582 1726855318.39438: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855318.39461: File lookup term: get_ansible_managed.j2 30582 1726855318.39464: variable 
'ansible_search_path' from source: unknown 30582 1726855318.39469: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855318.39481: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855318.39496: variable 'ansible_search_path' from source: unknown 30582 1726855318.43795: variable 'ansible_managed' from source: unknown 30582 1726855318.43895: variable 'omit' from source: magic vars 30582 1726855318.44024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855318.44028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855318.44031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855318.44033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855318.44035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855318.44037: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855318.44040: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.44047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.44117: Set connection var ansible_timeout to 10 30582 1726855318.44120: Set connection var ansible_connection to ssh 30582 1726855318.44125: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855318.44130: Set connection var ansible_pipelining to False 30582 1726855318.44134: Set connection var ansible_shell_executable to /bin/sh 30582 1726855318.44137: Set connection var ansible_shell_type to sh 30582 1726855318.44157: variable 'ansible_shell_executable' from source: unknown 30582 1726855318.44159: variable 'ansible_connection' from source: unknown 30582 1726855318.44162: variable 'ansible_module_compression' from source: unknown 30582 1726855318.44164: variable 'ansible_shell_type' from source: unknown 30582 1726855318.44166: variable 'ansible_shell_executable' from source: unknown 30582 1726855318.44171: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.44173: variable 'ansible_pipelining' from source: unknown 30582 1726855318.44179: variable 'ansible_timeout' from source: unknown 30582 1726855318.44181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.44333: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855318.44349: variable 'omit' from 
source: magic vars 30582 1726855318.44352: starting attempt loop 30582 1726855318.44355: running the handler 30582 1726855318.44357: _low_level_execute_command(): starting 30582 1726855318.44359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855318.45134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855318.45139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855318.45150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.45167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855318.45199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.45303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.47000: stdout chunk (state=3): >>>/root <<< 30582 1726855318.47106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.47140: stderr chunk (state=3): >>><<< 30582 1726855318.47144: stdout chunk (state=3): >>><<< 30582 
1726855318.47165: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855318.47178: _low_level_execute_command(): starting 30582 1726855318.47190: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613 `" && echo ansible-tmp-1726855318.4716573-33201-210861524434613="` echo /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613 `" ) && sleep 0' 30582 1726855318.47904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.47975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.47998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855318.48020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.48108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.50079: stdout chunk (state=3): >>>ansible-tmp-1726855318.4716573-33201-210861524434613=/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613 <<< 30582 1726855318.50202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.50226: stderr chunk (state=3): >>><<< 30582 1726855318.50229: stdout chunk (state=3): >>><<< 30582 1726855318.50245: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855318.4716573-33201-210861524434613=/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855318.50284: variable 'ansible_module_compression' from source: unknown 30582 1726855318.50325: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855318.50392: variable 'ansible_facts' from source: unknown 30582 1726855318.50551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py 30582 1726855318.50918: Sending initial data 30582 1726855318.50924: Sent initial data (168 bytes) 30582 1726855318.51235: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855318.51246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855318.51258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855318.51275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855318.51292: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855318.51304: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855318.51308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.51337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855318.51340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855318.51343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855318.51345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855318.51516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855318.51520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855318.51522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855318.51524: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855318.51526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.51528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.51530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.51805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.53343: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855318.53400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855318.53501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpk3nnxjuy /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py <<< 30582 1726855318.53504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py" <<< 30582 1726855318.53591: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpk3nnxjuy" to remote "/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py" <<< 30582 1726855318.55167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.55170: stdout chunk (state=3): >>><<< 30582 1726855318.55173: stderr chunk (state=3): >>><<< 30582 1726855318.55175: done transferring module to remote 30582 1726855318.55177: _low_level_execute_command(): starting 30582 1726855318.55179: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/ 
/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py && sleep 0' 30582 1726855318.55798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855318.55822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.55833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.55876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.55895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.55951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.57889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.57893: stdout chunk (state=3): >>><<< 30582 1726855318.57895: stderr chunk (state=3): >>><<< 30582 1726855318.58010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855318.58014: _low_level_execute_command(): starting 30582 1726855318.58016: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/AnsiballZ_network_connections.py && sleep 0' 30582 1726855318.58756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.58811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.58815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855318.58819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.58899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.88008: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855318.90389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855318.90420: stdout chunk (state=3): >>><<< 30582 1726855318.90450: stderr chunk (state=3): >>><<< 30582 1726855318.90479: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855318.90551: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855318.90572: _low_level_execute_command(): starting 30582 1726855318.90590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855318.4716573-33201-210861524434613/ > /dev/null 2>&1 && sleep 0' 30582 1726855318.91333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855318.91349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855318.91375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855318.91402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855318.91465: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855318.91493: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855318.91586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855318.91618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855318.91750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855318.93849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855318.93932: stderr chunk (state=3): >>><<< 30582 1726855318.93943: stdout chunk (state=3): >>><<< 30582 1726855318.94049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855318.94055: handler run complete 30582 1726855318.94095: attempt loop complete, returning result 30582 1726855318.94102: _execute() done 30582 1726855318.94105: dumping result to json 30582 1726855318.94107: done dumping result, returning 30582 1726855318.94193: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000001105] 30582 1726855318.94196: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001105 30582 1726855318.94268: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001105 30582 1726855318.94272: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 30582 1726855318.94424: no more pending results, returning what we have 30582 1726855318.94429: results queue empty 30582 1726855318.94430: checking for any_errors_fatal 30582 1726855318.94436: done checking for any_errors_fatal 30582 1726855318.94437: checking for 
max_fail_percentage 30582 1726855318.94440: done checking for max_fail_percentage 30582 1726855318.94441: checking to see if all hosts have failed and the running result is not ok 30582 1726855318.94442: done checking to see if all hosts have failed 30582 1726855318.94442: getting the remaining hosts for this loop 30582 1726855318.94444: done getting the remaining hosts for this loop 30582 1726855318.94568: getting the next task for host managed_node3 30582 1726855318.94580: done getting next task for host managed_node3 30582 1726855318.94584: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855318.94591: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855318.94606: getting variables 30582 1726855318.94607: in VariableManager get_vars() 30582 1726855318.94643: Calling all_inventory to load vars for managed_node3 30582 1726855318.94646: Calling groups_inventory to load vars for managed_node3 30582 1726855318.94648: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855318.94659: Calling all_plugins_play to load vars for managed_node3 30582 1726855318.94662: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855318.94806: Calling groups_plugins_play to load vars for managed_node3 30582 1726855318.96892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855318.98617: done with get_vars() 30582 1726855318.98655: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:01:58 -0400 (0:00:00.636) 0:00:55.337 ****** 30582 1726855318.98764: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855318.99174: worker is 1 (out of 1 available) 30582 1726855318.99191: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855318.99204: done queuing things up, now waiting for results queue to drain 30582 1726855318.99206: waiting for pending results... 
30582 1726855318.99504: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855318.99580: in run() - task 0affcc66-ac2b-aa83-7d57-000000001106 30582 1726855318.99605: variable 'ansible_search_path' from source: unknown 30582 1726855318.99613: variable 'ansible_search_path' from source: unknown 30582 1726855318.99654: calling self._execute() 30582 1726855318.99764: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855318.99779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855318.99798: variable 'omit' from source: magic vars 30582 1726855319.00148: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.00292: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.00295: variable 'network_state' from source: role '' defaults 30582 1726855319.00297: Evaluated conditional (network_state != {}): False 30582 1726855319.00304: when evaluation is False, skipping this task 30582 1726855319.00309: _execute() done 30582 1726855319.00316: dumping result to json 30582 1726855319.00323: done dumping result, returning 30582 1726855319.00335: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000001106] 30582 1726855319.00345: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001106 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855319.00505: no more pending results, returning what we have 30582 1726855319.00509: results queue empty 30582 1726855319.00510: checking for any_errors_fatal 30582 1726855319.00522: done checking for any_errors_fatal 30582 1726855319.00522: checking for max_fail_percentage 30582 1726855319.00524: done checking for max_fail_percentage 30582 1726855319.00525: 
checking to see if all hosts have failed and the running result is not ok 30582 1726855319.00526: done checking to see if all hosts have failed 30582 1726855319.00527: getting the remaining hosts for this loop 30582 1726855319.00528: done getting the remaining hosts for this loop 30582 1726855319.00532: getting the next task for host managed_node3 30582 1726855319.00540: done getting next task for host managed_node3 30582 1726855319.00544: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855319.00554: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855319.00576: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001106 30582 1726855319.00580: WORKER PROCESS EXITING 30582 1726855319.00596: getting variables 30582 1726855319.00598: in VariableManager get_vars() 30582 1726855319.00636: Calling all_inventory to load vars for managed_node3 30582 1726855319.00638: Calling groups_inventory to load vars for managed_node3 30582 1726855319.00640: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.00652: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.00656: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.00660: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.02246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.03804: done with get_vars() 30582 1726855319.03837: done getting variables 30582 1726855319.03909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:01:59 -0400 (0:00:00.051) 0:00:55.389 ****** 30582 1726855319.03945: entering _queue_task() for managed_node3/debug 30582 1726855319.04427: worker is 1 (out of 1 available) 30582 1726855319.04441: exiting _queue_task() for managed_node3/debug 30582 1726855319.04453: done queuing things up, now waiting for results queue to drain 30582 1726855319.04455: waiting for pending results... 
30582 1726855319.04691: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855319.04994: in run() - task 0affcc66-ac2b-aa83-7d57-000000001107 30582 1726855319.04998: variable 'ansible_search_path' from source: unknown 30582 1726855319.05001: variable 'ansible_search_path' from source: unknown 30582 1726855319.05004: calling self._execute() 30582 1726855319.05028: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.05040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.05056: variable 'omit' from source: magic vars 30582 1726855319.05434: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.05452: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.05465: variable 'omit' from source: magic vars 30582 1726855319.05534: variable 'omit' from source: magic vars 30582 1726855319.05575: variable 'omit' from source: magic vars 30582 1726855319.05624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855319.05664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855319.05694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855319.05717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.05733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.05768: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855319.05781: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.05792: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855319.05908: Set connection var ansible_timeout to 10 30582 1726855319.05917: Set connection var ansible_connection to ssh 30582 1726855319.05927: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855319.05934: Set connection var ansible_pipelining to False 30582 1726855319.05940: Set connection var ansible_shell_executable to /bin/sh 30582 1726855319.05991: Set connection var ansible_shell_type to sh 30582 1726855319.05994: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.05996: variable 'ansible_connection' from source: unknown 30582 1726855319.05998: variable 'ansible_module_compression' from source: unknown 30582 1726855319.05999: variable 'ansible_shell_type' from source: unknown 30582 1726855319.06001: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.06002: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.06004: variable 'ansible_pipelining' from source: unknown 30582 1726855319.06005: variable 'ansible_timeout' from source: unknown 30582 1726855319.06007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.06139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855319.06156: variable 'omit' from source: magic vars 30582 1726855319.06165: starting attempt loop 30582 1726855319.06174: running the handler 30582 1726855319.06312: variable '__network_connections_result' from source: set_fact 30582 1726855319.06592: handler run complete 30582 1726855319.06596: attempt loop complete, returning result 30582 1726855319.06599: _execute() done 30582 1726855319.06602: dumping result to json 30582 1726855319.06605: 
done dumping result, returning 30582 1726855319.06609: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001107] 30582 1726855319.06612: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001107 30582 1726855319.06689: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001107 30582 1726855319.06694: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0" ] } 30582 1726855319.06759: no more pending results, returning what we have 30582 1726855319.06762: results queue empty 30582 1726855319.06763: checking for any_errors_fatal 30582 1726855319.06769: done checking for any_errors_fatal 30582 1726855319.06769: checking for max_fail_percentage 30582 1726855319.06771: done checking for max_fail_percentage 30582 1726855319.06772: checking to see if all hosts have failed and the running result is not ok 30582 1726855319.06773: done checking to see if all hosts have failed 30582 1726855319.06773: getting the remaining hosts for this loop 30582 1726855319.06775: done getting the remaining hosts for this loop 30582 1726855319.06778: getting the next task for host managed_node3 30582 1726855319.06785: done getting next task for host managed_node3 30582 1726855319.06791: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855319.06795: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855319.06806: getting variables 30582 1726855319.06808: in VariableManager get_vars() 30582 1726855319.06842: Calling all_inventory to load vars for managed_node3 30582 1726855319.06844: Calling groups_inventory to load vars for managed_node3 30582 1726855319.06846: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.06854: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.06856: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.06858: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.10186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.12016: done with get_vars() 30582 1726855319.12050: done getting variables 30582 1726855319.12125: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:01:59 -0400 (0:00:00.082) 0:00:55.471 ****** 30582 1726855319.12169: entering _queue_task() for managed_node3/debug 30582 1726855319.12556: worker is 1 (out of 1 available) 30582 1726855319.12569: exiting _queue_task() for managed_node3/debug 30582 1726855319.12583: done queuing things up, now waiting for results queue to drain 30582 1726855319.12585: waiting for pending results... 30582 1726855319.12929: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855319.13194: in run() - task 0affcc66-ac2b-aa83-7d57-000000001108 30582 1726855319.13199: variable 'ansible_search_path' from source: unknown 30582 1726855319.13202: variable 'ansible_search_path' from source: unknown 30582 1726855319.13205: calling self._execute() 30582 1726855319.13296: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.13300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.13303: variable 'omit' from source: magic vars 30582 1726855319.14064: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.14079: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.14086: variable 'omit' from source: magic vars 30582 1726855319.14146: variable 'omit' from source: magic vars 30582 1726855319.14184: variable 'omit' from source: magic vars 30582 1726855319.14437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855319.14473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855319.14599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855319.14622: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.14635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.14692: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855319.14695: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.14698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.14942: Set connection var ansible_timeout to 10 30582 1726855319.15000: Set connection var ansible_connection to ssh 30582 1726855319.15006: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855319.15012: Set connection var ansible_pipelining to False 30582 1726855319.15018: Set connection var ansible_shell_executable to /bin/sh 30582 1726855319.15020: Set connection var ansible_shell_type to sh 30582 1726855319.15046: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.15049: variable 'ansible_connection' from source: unknown 30582 1726855319.15052: variable 'ansible_module_compression' from source: unknown 30582 1726855319.15057: variable 'ansible_shell_type' from source: unknown 30582 1726855319.15400: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.15403: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.15405: variable 'ansible_pipelining' from source: unknown 30582 1726855319.15408: variable 'ansible_timeout' from source: unknown 30582 1726855319.15410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.15412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855319.15415: variable 'omit' from source: magic vars 30582 1726855319.15417: starting attempt loop 30582 1726855319.15419: running the handler 30582 1726855319.15421: variable '__network_connections_result' from source: set_fact 30582 1726855319.15510: variable '__network_connections_result' from source: set_fact 30582 1726855319.15641: handler run complete 30582 1726855319.15668: attempt loop complete, returning result 30582 1726855319.15671: _execute() done 30582 1726855319.15673: dumping result to json 30582 1726855319.15698: done dumping result, returning 30582 1726855319.15701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001108] 30582 1726855319.15703: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001108 30582 1726855319.15802: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001108 30582 1726855319.15805: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0" ] } } 30582 1726855319.15916: no more pending results, returning what we have 30582 1726855319.15920: results queue 
empty 30582 1726855319.15921: checking for any_errors_fatal 30582 1726855319.15929: done checking for any_errors_fatal 30582 1726855319.15930: checking for max_fail_percentage 30582 1726855319.15932: done checking for max_fail_percentage 30582 1726855319.15933: checking to see if all hosts have failed and the running result is not ok 30582 1726855319.15934: done checking to see if all hosts have failed 30582 1726855319.15935: getting the remaining hosts for this loop 30582 1726855319.15937: done getting the remaining hosts for this loop 30582 1726855319.15941: getting the next task for host managed_node3 30582 1726855319.15951: done getting next task for host managed_node3 30582 1726855319.15955: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855319.15961: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855319.15980: getting variables 30582 1726855319.15983: in VariableManager get_vars() 30582 1726855319.16034: Calling all_inventory to load vars for managed_node3 30582 1726855319.16037: Calling groups_inventory to load vars for managed_node3 30582 1726855319.16040: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.16052: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.16056: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.16059: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.17921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.20109: done with get_vars() 30582 1726855319.20149: done getting variables 30582 1726855319.20221: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 14:01:59 -0400 (0:00:00.080) 0:00:55.552 ******
30582 1726855319.20261: entering _queue_task() for managed_node3/debug 30582 1726855319.21029: worker is 1 (out of 1 available) 30582 1726855319.21045: exiting _queue_task() for managed_node3/debug 30582 1726855319.21060: done queuing things up, now waiting for results queue to drain 30582 1726855319.21062: waiting for pending results... 
30582 1726855319.21467: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855319.21630: in run() - task 0affcc66-ac2b-aa83-7d57-000000001109 30582 1726855319.21635: variable 'ansible_search_path' from source: unknown 30582 1726855319.21638: variable 'ansible_search_path' from source: unknown 30582 1726855319.21663: calling self._execute() 30582 1726855319.21771: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.21780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.21791: variable 'omit' from source: magic vars 30582 1726855319.22203: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.22215: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.22359: variable 'network_state' from source: role '' defaults 30582 1726855319.22363: Evaluated conditional (network_state != {}): False 30582 1726855319.22365: when evaluation is False, skipping this task 30582 1726855319.22368: _execute() done 30582 1726855319.22371: dumping result to json 30582 1726855319.22373: done dumping result, returning 30582 1726855319.22562: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000001109] 30582 1726855319.22565: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001109 30582 1726855319.22734: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001109 30582 1726855319.22738: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
30582 1726855319.22783: no more pending results, returning what we have 30582 1726855319.22788: results queue empty 30582 1726855319.22789: checking for any_errors_fatal 30582 1726855319.22797: done checking for any_errors_fatal 30582 1726855319.22798: checking for
max_fail_percentage 30582 1726855319.22800: done checking for max_fail_percentage 30582 1726855319.22801: checking to see if all hosts have failed and the running result is not ok 30582 1726855319.22801: done checking to see if all hosts have failed 30582 1726855319.22802: getting the remaining hosts for this loop 30582 1726855319.22803: done getting the remaining hosts for this loop 30582 1726855319.22807: getting the next task for host managed_node3 30582 1726855319.22815: done getting next task for host managed_node3 30582 1726855319.22819: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855319.22824: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855319.22847: getting variables 30582 1726855319.22848: in VariableManager get_vars() 30582 1726855319.22889: Calling all_inventory to load vars for managed_node3 30582 1726855319.22892: Calling groups_inventory to load vars for managed_node3 30582 1726855319.22895: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.22904: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.22907: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.22910: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.24990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.25983: done with get_vars() 30582 1726855319.26003: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 14:01:59 -0400 (0:00:00.058) 0:00:55.610 ******
30582 1726855319.26079: entering _queue_task() for managed_node3/ping 30582 1726855319.26348: worker is 1 (out of 1 available) 30582 1726855319.26363: exiting _queue_task() for managed_node3/ping 30582 1726855319.26375: done queuing things up, now waiting for results queue to drain 30582 1726855319.26377: waiting for pending results... 
30582 1726855319.26717: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855319.26785: in run() - task 0affcc66-ac2b-aa83-7d57-00000000110a 30582 1726855319.26844: variable 'ansible_search_path' from source: unknown 30582 1726855319.27194: variable 'ansible_search_path' from source: unknown 30582 1726855319.27201: calling self._execute() 30582 1726855319.27204: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.27207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.27209: variable 'omit' from source: magic vars 30582 1726855319.27982: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.28005: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.28018: variable 'omit' from source: magic vars 30582 1726855319.28097: variable 'omit' from source: magic vars 30582 1726855319.28134: variable 'omit' from source: magic vars 30582 1726855319.28167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855319.28201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855319.28218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855319.28231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.28241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.28272: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855319.28276: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.28278: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855319.28354: Set connection var ansible_timeout to 10 30582 1726855319.28357: Set connection var ansible_connection to ssh 30582 1726855319.28362: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855319.28367: Set connection var ansible_pipelining to False 30582 1726855319.28375: Set connection var ansible_shell_executable to /bin/sh 30582 1726855319.28377: Set connection var ansible_shell_type to sh 30582 1726855319.28399: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.28402: variable 'ansible_connection' from source: unknown 30582 1726855319.28410: variable 'ansible_module_compression' from source: unknown 30582 1726855319.28415: variable 'ansible_shell_type' from source: unknown 30582 1726855319.28417: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.28420: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.28422: variable 'ansible_pipelining' from source: unknown 30582 1726855319.28424: variable 'ansible_timeout' from source: unknown 30582 1726855319.28425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.28574: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855319.28582: variable 'omit' from source: magic vars 30582 1726855319.28589: starting attempt loop 30582 1726855319.28592: running the handler 30582 1726855319.28606: _low_level_execute_command(): starting 30582 1726855319.28612: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855319.29129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855319.29134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855319.29137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855319.29139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.29182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855319.29186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855319.29271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.31056: stdout chunk (state=3): >>>/root <<< 30582 1726855319.31084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.31118: stderr chunk (state=3): >>><<< 30582 1726855319.31122: stdout chunk (state=3): >>><<< 30582 1726855319.31147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855319.31162: _low_level_execute_command(): starting 30582 1726855319.31170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487 `" && echo ansible-tmp-1726855319.3114772-33253-122875084326487="` echo /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487 `" ) && sleep 0' 30582 1726855319.31838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855319.31842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855319.31870: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855319.31873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.31884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855319.31886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.31933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855319.31937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855319.31948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855319.32014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.33997: stdout chunk (state=3): >>>ansible-tmp-1726855319.3114772-33253-122875084326487=/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487 <<< 30582 1726855319.34141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.34146: stdout chunk (state=3): >>><<< 30582 1726855319.34595: stderr chunk (state=3): >>><<< 30582 1726855319.34599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855319.3114772-33253-122875084326487=/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855319.34602: variable 'ansible_module_compression' from source: unknown 30582 1726855319.34612: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855319.34693: variable 'ansible_facts' from source: unknown 30582 1726855319.34993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py 30582 1726855319.35140: Sending initial data 30582 1726855319.35148: Sent initial data (153 bytes) 30582 1726855319.35682: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855319.35699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.35712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.35762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855319.35774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855319.35851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.37617: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855319.37838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855319.37902: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0ykw0oqi /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py <<< 30582 1726855319.37913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py" <<< 30582 1726855319.37968: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0ykw0oqi" to remote "/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py" <<< 30582 1726855319.38702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.38753: stderr chunk (state=3): >>><<< 30582 1726855319.38757: stdout chunk (state=3): >>><<< 30582 1726855319.38800: done transferring module to remote 30582 1726855319.38809: _low_level_execute_command(): starting 30582 1726855319.38813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/ /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py && sleep 0' 30582 1726855319.39259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855319.39263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.39265: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 30582 1726855319.39268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855319.39274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.39316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855319.39320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855319.39398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.41250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.41279: stderr chunk (state=3): >>><<< 30582 1726855319.41283: stdout chunk (state=3): >>><<< 30582 1726855319.41314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855319.41317: _low_level_execute_command(): starting 30582 1726855319.41335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/AnsiballZ_ping.py && sleep 0' 30582 1726855319.41981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855319.41985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855319.41989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.42009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.42100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855319.42189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.57675: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855319.58850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.58868: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855319.58927: stderr chunk (state=3): >>><<< 30582 1726855319.58946: stdout chunk (state=3): >>><<< 30582 1726855319.58976: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855319.59012: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855319.59029: _low_level_execute_command(): starting 30582 1726855319.59055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855319.3114772-33253-122875084326487/ > /dev/null 2>&1 && sleep 0' 30582 1726855319.60358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855319.60371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855319.60391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855319.60464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855319.60508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855319.60530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855319.60551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855319.60656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855319.62647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855319.62651: stdout chunk (state=3): >>><<< 30582 1726855319.62653: stderr chunk (state=3): >>><<< 30582 1726855319.62795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
30582 1726855319.62799: handler run complete 30582 1726855319.62801: attempt loop complete, returning result 30582 1726855319.62804: _execute() done 30582 1726855319.62807: dumping result to json 30582 1726855319.62809: done dumping result, returning 30582 1726855319.62811: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-00000000110a] 30582 1726855319.62813: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000110a 30582 1726855319.62884: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000110a 30582 1726855319.62890: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855319.62953: no more pending results, returning what we have 30582 1726855319.62956: results queue empty 30582 1726855319.62957: checking for any_errors_fatal 30582 1726855319.62964: done checking for any_errors_fatal 30582 1726855319.62965: checking for max_fail_percentage 30582 1726855319.62967: done checking for max_fail_percentage 30582 1726855319.62968: checking to see if all hosts have failed and the running result is not ok 30582 1726855319.62968: done checking to see if all hosts have failed 30582 1726855319.62969: getting the remaining hosts for this loop 30582 1726855319.62973: done getting the remaining hosts for this loop 30582 1726855319.62976: getting the next task for host managed_node3 30582 1726855319.63032: done getting next task for host managed_node3 30582 1726855319.63035: ^ task is: TASK: meta (role_complete) 30582 1726855319.63040: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855319.63054: getting variables 30582 1726855319.63056: in VariableManager get_vars() 30582 1726855319.63218: Calling all_inventory to load vars for managed_node3 30582 1726855319.63220: Calling groups_inventory to load vars for managed_node3 30582 1726855319.63223: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.63231: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.63234: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.63237: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.65059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.66717: done with get_vars() 30582 1726855319.66754: done getting variables 30582 1726855319.66860: done queuing things up, now waiting for results queue to drain 30582 1726855319.66863: results queue empty 30582 1726855319.66864: checking for any_errors_fatal 30582 1726855319.66867: done checking for any_errors_fatal 30582 1726855319.66868: checking for max_fail_percentage 30582 1726855319.66869: done checking for max_fail_percentage 30582 1726855319.66870: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855319.66873: done checking to see if all hosts have failed 30582 1726855319.66874: getting the remaining hosts for this loop 30582 1726855319.66875: done getting the remaining hosts for this loop 30582 1726855319.66878: getting the next task for host managed_node3 30582 1726855319.66883: done getting next task for host managed_node3 30582 1726855319.66886: ^ task is: TASK: Show result 30582 1726855319.66891: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855319.66894: getting variables 30582 1726855319.66895: in VariableManager get_vars() 30582 1726855319.66909: Calling all_inventory to load vars for managed_node3 30582 1726855319.66912: Calling groups_inventory to load vars for managed_node3 30582 1726855319.66921: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.66931: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.66933: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.66936: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.68489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.78772: done with get_vars() 30582 1726855319.78816: done getting variables 30582 1726855319.78865: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 14:01:59 -0400 (0:00:00.528) 0:00:56.138 ****** 30582 1726855319.78904: entering _queue_task() for managed_node3/debug 30582 1726855319.79398: worker is 1 (out of 1 available) 30582 1726855319.79417: exiting _queue_task() for managed_node3/debug 30582 1726855319.79448: done queuing things up, now waiting for results queue to drain 30582 1726855319.79451: waiting for pending results... 
30582 1726855319.79676: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855319.79983: in run() - task 0affcc66-ac2b-aa83-7d57-000000001090 30582 1726855319.79990: variable 'ansible_search_path' from source: unknown 30582 1726855319.79994: variable 'ansible_search_path' from source: unknown 30582 1726855319.79997: calling self._execute() 30582 1726855319.80001: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.80005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.80009: variable 'omit' from source: magic vars 30582 1726855319.80525: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.80544: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.80557: variable 'omit' from source: magic vars 30582 1726855319.80626: variable 'omit' from source: magic vars 30582 1726855319.80674: variable 'omit' from source: magic vars 30582 1726855319.80733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855319.80779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855319.80810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855319.80890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.80931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855319.81023: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855319.81038: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.81166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.81496: Set 
connection var ansible_timeout to 10 30582 1726855319.81499: Set connection var ansible_connection to ssh 30582 1726855319.81502: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855319.81504: Set connection var ansible_pipelining to False 30582 1726855319.81507: Set connection var ansible_shell_executable to /bin/sh 30582 1726855319.81509: Set connection var ansible_shell_type to sh 30582 1726855319.81511: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.81603: variable 'ansible_connection' from source: unknown 30582 1726855319.81607: variable 'ansible_module_compression' from source: unknown 30582 1726855319.81610: variable 'ansible_shell_type' from source: unknown 30582 1726855319.81613: variable 'ansible_shell_executable' from source: unknown 30582 1726855319.81615: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.81617: variable 'ansible_pipelining' from source: unknown 30582 1726855319.81619: variable 'ansible_timeout' from source: unknown 30582 1726855319.81623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.81960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855319.81964: variable 'omit' from source: magic vars 30582 1726855319.81966: starting attempt loop 30582 1726855319.81968: running the handler 30582 1726855319.82112: variable '__network_connections_result' from source: set_fact 30582 1726855319.82253: variable '__network_connections_result' from source: set_fact 30582 1726855319.82441: handler run complete 30582 1726855319.82470: attempt loop complete, returning result 30582 1726855319.82479: _execute() done 30582 1726855319.82492: dumping result to json 30582 
1726855319.82524: done dumping result, returning 30582 1726855319.82537: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-000000001090] 30582 1726855319.82551: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001090 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0" ] } } 30582 1726855319.82949: no more pending results, returning what we have 30582 1726855319.82954: results queue empty 30582 1726855319.82955: checking for any_errors_fatal 30582 1726855319.82957: done checking for any_errors_fatal 30582 1726855319.82958: checking for max_fail_percentage 30582 1726855319.82960: done checking for max_fail_percentage 30582 1726855319.82961: checking to see if all hosts have failed and the running result is not ok 30582 1726855319.82962: done checking to see if all hosts have failed 30582 1726855319.82963: getting the remaining hosts for this loop 30582 1726855319.82965: done getting the remaining hosts for this loop 30582 1726855319.82969: getting the next task for host managed_node3 30582 1726855319.82989: done getting next task for host managed_node3 30582 1726855319.82993: ^ task is: TASK: Include network role 30582 1726855319.82997: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855319.83003: getting variables 30582 1726855319.83005: in VariableManager get_vars() 30582 1726855319.83042: Calling all_inventory to load vars for managed_node3 30582 1726855319.83046: Calling groups_inventory to load vars for managed_node3 30582 1726855319.83050: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.83064: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.83068: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855319.83073: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.83601: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001090 30582 1726855319.83605: WORKER PROCESS EXITING 30582 1726855319.84938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.86863: done with get_vars() 30582 1726855319.86902: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 14:01:59 -0400 (0:00:00.080) 0:00:56.219 ****** 30582 1726855319.87011: entering _queue_task() 
for managed_node3/include_role 30582 1726855319.87493: worker is 1 (out of 1 available) 30582 1726855319.87505: exiting _queue_task() for managed_node3/include_role 30582 1726855319.87517: done queuing things up, now waiting for results queue to drain 30582 1726855319.87518: waiting for pending results... 30582 1726855319.87740: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855319.87940: in run() - task 0affcc66-ac2b-aa83-7d57-000000001094 30582 1726855319.87959: variable 'ansible_search_path' from source: unknown 30582 1726855319.87980: variable 'ansible_search_path' from source: unknown 30582 1726855319.88055: calling self._execute() 30582 1726855319.88161: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855319.88188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855319.88393: variable 'omit' from source: magic vars 30582 1726855319.88650: variable 'ansible_distribution_major_version' from source: facts 30582 1726855319.88673: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855319.88684: _execute() done 30582 1726855319.88694: dumping result to json 30582 1726855319.88704: done dumping result, returning 30582 1726855319.88716: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000001094] 30582 1726855319.88733: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001094 30582 1726855319.88963: no more pending results, returning what we have 30582 1726855319.88969: in VariableManager get_vars() 30582 1726855319.89017: Calling all_inventory to load vars for managed_node3 30582 1726855319.89021: Calling groups_inventory to load vars for managed_node3 30582 1726855319.89025: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855319.89040: Calling all_plugins_play to load vars for managed_node3 30582 1726855319.89043: Calling groups_plugins_inventory to load 
vars for managed_node3 30582 1726855319.89047: Calling groups_plugins_play to load vars for managed_node3 30582 1726855319.89704: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001094 30582 1726855319.89708: WORKER PROCESS EXITING 30582 1726855319.92789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855319.96249: done with get_vars() 30582 1726855319.96283: variable 'ansible_search_path' from source: unknown 30582 1726855319.96285: variable 'ansible_search_path' from source: unknown 30582 1726855319.96462: variable 'omit' from source: magic vars 30582 1726855319.96513: variable 'omit' from source: magic vars 30582 1726855319.96530: variable 'omit' from source: magic vars 30582 1726855319.96534: we have included files to process 30582 1726855319.96535: generating all_blocks data 30582 1726855319.96537: done generating all_blocks data 30582 1726855319.96541: processing included file: fedora.linux_system_roles.network 30582 1726855319.96568: in VariableManager get_vars() 30582 1726855319.96589: done with get_vars() 30582 1726855319.96621: in VariableManager get_vars() 30582 1726855319.96641: done with get_vars() 30582 1726855319.96693: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855319.96828: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855319.96927: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855319.97258: in VariableManager get_vars() 30582 1726855319.97274: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855319.99508: iterating over new_blocks loaded from include file 30582 1726855319.99511: in VariableManager get_vars() 30582 1726855319.99542: done with get_vars() 30582 1726855319.99544: 
filtering new block on tags 30582 1726855319.99994: done filtering new block on tags 30582 1726855319.99999: in VariableManager get_vars() 30582 1726855320.00019: done with get_vars() 30582 1726855320.00024: filtering new block on tags 30582 1726855320.00041: done filtering new block on tags 30582 1726855320.00043: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855320.00049: extending task lists for all hosts with included blocks 30582 1726855320.00241: done extending task lists 30582 1726855320.00242: done processing included files 30582 1726855320.00243: results queue empty 30582 1726855320.00244: checking for any_errors_fatal 30582 1726855320.00249: done checking for any_errors_fatal 30582 1726855320.00249: checking for max_fail_percentage 30582 1726855320.00251: done checking for max_fail_percentage 30582 1726855320.00252: checking to see if all hosts have failed and the running result is not ok 30582 1726855320.00252: done checking to see if all hosts have failed 30582 1726855320.00253: getting the remaining hosts for this loop 30582 1726855320.00254: done getting the remaining hosts for this loop 30582 1726855320.00257: getting the next task for host managed_node3 30582 1726855320.00262: done getting next task for host managed_node3 30582 1726855320.00265: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855320.00269: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855320.00285: getting variables 30582 1726855320.00286: in VariableManager get_vars() 30582 1726855320.00305: Calling all_inventory to load vars for managed_node3 30582 1726855320.00307: Calling groups_inventory to load vars for managed_node3 30582 1726855320.00309: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855320.00315: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.00318: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.00321: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.01881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.03844: done with get_vars() 30582 1726855320.03903: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:00 -0400 (0:00:00.170) 0:00:56.390 ****** 30582 1726855320.04051: entering _queue_task() for managed_node3/include_tasks 30582 1726855320.04541: worker is 1 (out of 1 available) 30582 
1726855320.04552: exiting _queue_task() for managed_node3/include_tasks 30582 1726855320.04563: done queuing things up, now waiting for results queue to drain 30582 1726855320.04565: waiting for pending results... 30582 1726855320.04961: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855320.05300: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127a 30582 1726855320.05305: variable 'ansible_search_path' from source: unknown 30582 1726855320.05308: variable 'ansible_search_path' from source: unknown 30582 1726855320.05312: calling self._execute() 30582 1726855320.05403: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.05408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.05411: variable 'omit' from source: magic vars 30582 1726855320.06024: variable 'ansible_distribution_major_version' from source: facts 30582 1726855320.06042: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855320.06046: _execute() done 30582 1726855320.06049: dumping result to json 30582 1726855320.06080: done dumping result, returning 30582 1726855320.06091: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-00000000127a] 30582 1726855320.06096: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127a 30582 1726855320.06211: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127a 30582 1726855320.06215: WORKER PROCESS EXITING 30582 1726855320.06278: no more pending results, returning what we have 30582 1726855320.06284: in VariableManager get_vars() 30582 1726855320.06336: Calling all_inventory to load vars for managed_node3 30582 1726855320.06340: Calling groups_inventory to load vars for managed_node3 30582 1726855320.06343: Calling all_plugins_inventory to load vars for managed_node3 
30582 1726855320.06356: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.06359: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.06362: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.08135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.09879: done with get_vars() 30582 1726855320.09918: variable 'ansible_search_path' from source: unknown 30582 1726855320.09920: variable 'ansible_search_path' from source: unknown 30582 1726855320.09961: we have included files to process 30582 1726855320.09963: generating all_blocks data 30582 1726855320.09964: done generating all_blocks data 30582 1726855320.09968: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855320.09969: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855320.09971: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855320.10607: done processing included file 30582 1726855320.10610: iterating over new_blocks loaded from include file 30582 1726855320.10611: in VariableManager get_vars() 30582 1726855320.10645: done with get_vars() 30582 1726855320.10648: filtering new block on tags 30582 1726855320.10686: done filtering new block on tags 30582 1726855320.10691: in VariableManager get_vars() 30582 1726855320.10714: done with get_vars() 30582 1726855320.10715: filtering new block on tags 30582 1726855320.10769: done filtering new block on tags 30582 1726855320.10773: in VariableManager get_vars() 30582 1726855320.10798: done with get_vars() 30582 1726855320.10800: filtering new block on tags 30582 1726855320.10847: done filtering new block on tags 30582 1726855320.10850: done iterating over new_blocks 
loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855320.10856: extending task lists for all hosts with included blocks 30582 1726855320.12739: done extending task lists 30582 1726855320.12742: done processing included files 30582 1726855320.12743: results queue empty 30582 1726855320.12743: checking for any_errors_fatal 30582 1726855320.12746: done checking for any_errors_fatal 30582 1726855320.12747: checking for max_fail_percentage 30582 1726855320.12749: done checking for max_fail_percentage 30582 1726855320.12750: checking to see if all hosts have failed and the running result is not ok 30582 1726855320.12751: done checking to see if all hosts have failed 30582 1726855320.12751: getting the remaining hosts for this loop 30582 1726855320.12753: done getting the remaining hosts for this loop 30582 1726855320.12755: getting the next task for host managed_node3 30582 1726855320.12761: done getting next task for host managed_node3 30582 1726855320.12764: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855320.12769: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855320.12782: getting variables 30582 1726855320.12783: in VariableManager get_vars() 30582 1726855320.12804: Calling all_inventory to load vars for managed_node3 30582 1726855320.12807: Calling groups_inventory to load vars for managed_node3 30582 1726855320.12809: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855320.12815: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.12817: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.12820: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.14101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.15566: done with get_vars() 30582 1726855320.15589: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:00 -0400 (0:00:00.116) 0:00:56.506 ****** 30582 1726855320.15651: entering _queue_task() for managed_node3/setup 30582 1726855320.15924: worker is 1 (out of 1 available) 30582 1726855320.15939: exiting _queue_task() for managed_node3/setup 30582 
1726855320.15952: done queuing things up, now waiting for results queue to drain 30582 1726855320.15954: waiting for pending results... 30582 1726855320.16141: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855320.16246: in run() - task 0affcc66-ac2b-aa83-7d57-0000000012d1 30582 1726855320.16257: variable 'ansible_search_path' from source: unknown 30582 1726855320.16261: variable 'ansible_search_path' from source: unknown 30582 1726855320.16298: calling self._execute() 30582 1726855320.16370: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.16376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.16386: variable 'omit' from source: magic vars 30582 1726855320.16674: variable 'ansible_distribution_major_version' from source: facts 30582 1726855320.16686: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855320.16844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855320.18890: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855320.18950: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855320.18981: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855320.19073: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855320.19076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855320.19121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30582 1726855320.19154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855320.19183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855320.19217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855320.19229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855320.19292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855320.19311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855320.19327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855320.19351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855320.19362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855320.19492: variable '__network_required_facts' from source: role '' defaults 30582 1726855320.19496: variable 'ansible_facts' from source: unknown 30582 1726855320.19960: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855320.19965: when evaluation is False, skipping this task 30582 1726855320.19968: _execute() done 30582 1726855320.19973: dumping result to json 30582 1726855320.19976: done dumping result, returning 30582 1726855320.19979: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-0000000012d1] 30582 1726855320.19982: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d1 30582 1726855320.20075: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d1 30582 1726855320.20078: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855320.20129: no more pending results, returning what we have 30582 1726855320.20132: results queue empty 30582 1726855320.20133: checking for any_errors_fatal 30582 1726855320.20135: done checking for any_errors_fatal 30582 1726855320.20136: checking for max_fail_percentage 30582 1726855320.20137: done checking for max_fail_percentage 30582 1726855320.20138: checking to see if all hosts have failed and the running result is not ok 30582 1726855320.20139: done checking to see if all hosts have failed 30582 1726855320.20140: getting the remaining hosts for this loop 30582 1726855320.20141: done getting the remaining hosts for this loop 30582 1726855320.20145: getting the next task for host managed_node3 30582 1726855320.20157: done getting next task for host managed_node3 
30582 1726855320.20160: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855320.20166: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855320.20200: getting variables 30582 1726855320.20202: in VariableManager get_vars() 30582 1726855320.20241: Calling all_inventory to load vars for managed_node3 30582 1726855320.20244: Calling groups_inventory to load vars for managed_node3 30582 1726855320.20247: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855320.20256: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.20260: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.20269: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.21119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.22602: done with get_vars() 30582 1726855320.22619: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:00 -0400 (0:00:00.070) 0:00:56.576 ****** 30582 1726855320.22697: entering _queue_task() for managed_node3/stat 30582 1726855320.22965: worker is 1 (out of 1 available) 30582 1726855320.22982: exiting _queue_task() for managed_node3/stat 30582 1726855320.22996: done queuing things up, now waiting for results queue to drain 30582 1726855320.22998: waiting for pending results... 
30582 1726855320.23178: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855320.23278: in run() - task 0affcc66-ac2b-aa83-7d57-0000000012d3 30582 1726855320.23290: variable 'ansible_search_path' from source: unknown 30582 1726855320.23299: variable 'ansible_search_path' from source: unknown 30582 1726855320.23329: calling self._execute() 30582 1726855320.23409: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.23413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.23421: variable 'omit' from source: magic vars 30582 1726855320.23713: variable 'ansible_distribution_major_version' from source: facts 30582 1726855320.23722: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855320.23847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855320.24055: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855320.24090: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855320.24118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855320.24146: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855320.24214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855320.24232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855320.24249: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855320.24268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855320.24336: variable '__network_is_ostree' from source: set_fact 30582 1726855320.24342: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855320.24345: when evaluation is False, skipping this task 30582 1726855320.24347: _execute() done 30582 1726855320.24349: dumping result to json 30582 1726855320.24354: done dumping result, returning 30582 1726855320.24361: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-0000000012d3] 30582 1726855320.24366: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d3 30582 1726855320.24458: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d3 30582 1726855320.24461: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855320.24510: no more pending results, returning what we have 30582 1726855320.24514: results queue empty 30582 1726855320.24515: checking for any_errors_fatal 30582 1726855320.24522: done checking for any_errors_fatal 30582 1726855320.24522: checking for max_fail_percentage 30582 1726855320.24524: done checking for max_fail_percentage 30582 1726855320.24525: checking to see if all hosts have failed and the running result is not ok 30582 1726855320.24526: done checking to see if all hosts have failed 30582 1726855320.24527: getting the remaining hosts for this loop 30582 1726855320.24528: done getting the remaining hosts for this loop 30582 
1726855320.24532: getting the next task for host managed_node3 30582 1726855320.24542: done getting next task for host managed_node3 30582 1726855320.24545: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855320.24552: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855320.24575: getting variables 30582 1726855320.24576: in VariableManager get_vars() 30582 1726855320.24616: Calling all_inventory to load vars for managed_node3 30582 1726855320.24619: Calling groups_inventory to load vars for managed_node3 30582 1726855320.24621: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855320.24631: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.24634: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.24636: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.25434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.26327: done with get_vars() 30582 1726855320.26348: done getting variables 30582 1726855320.26401: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:00 -0400 (0:00:00.037) 0:00:56.614 ****** 30582 1726855320.26432: entering _queue_task() for managed_node3/set_fact 30582 1726855320.26710: worker is 1 (out of 1 available) 30582 1726855320.26723: exiting _queue_task() for managed_node3/set_fact 30582 1726855320.26736: done queuing things up, now waiting for results queue to drain 30582 1726855320.26738: waiting for pending results... 
30582 1726855320.26929: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855320.27039: in run() - task 0affcc66-ac2b-aa83-7d57-0000000012d4 30582 1726855320.27051: variable 'ansible_search_path' from source: unknown 30582 1726855320.27054: variable 'ansible_search_path' from source: unknown 30582 1726855320.27091: calling self._execute() 30582 1726855320.27161: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.27165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.27180: variable 'omit' from source: magic vars 30582 1726855320.27467: variable 'ansible_distribution_major_version' from source: facts 30582 1726855320.27479: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855320.27603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855320.27810: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855320.27847: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855320.27878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855320.27904: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855320.27971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855320.27991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855320.28010: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855320.28027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855320.28097: variable '__network_is_ostree' from source: set_fact 30582 1726855320.28103: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855320.28106: when evaluation is False, skipping this task 30582 1726855320.28109: _execute() done 30582 1726855320.28111: dumping result to json 30582 1726855320.28115: done dumping result, returning 30582 1726855320.28123: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-0000000012d4] 30582 1726855320.28128: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d4 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855320.28265: no more pending results, returning what we have 30582 1726855320.28269: results queue empty 30582 1726855320.28270: checking for any_errors_fatal 30582 1726855320.28277: done checking for any_errors_fatal 30582 1726855320.28278: checking for max_fail_percentage 30582 1726855320.28280: done checking for max_fail_percentage 30582 1726855320.28281: checking to see if all hosts have failed and the running result is not ok 30582 1726855320.28281: done checking to see if all hosts have failed 30582 1726855320.28282: getting the remaining hosts for this loop 30582 1726855320.28283: done getting the remaining hosts for this loop 30582 1726855320.28288: getting the next task for host managed_node3 30582 1726855320.28302: done getting next task for host managed_node3 30582 
1726855320.28305: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855320.28312: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855320.28337: getting variables 30582 1726855320.28339: in VariableManager get_vars() 30582 1726855320.28378: Calling all_inventory to load vars for managed_node3 30582 1726855320.28381: Calling groups_inventory to load vars for managed_node3 30582 1726855320.28383: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855320.28401: Calling all_plugins_play to load vars for managed_node3 30582 1726855320.28405: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855320.28410: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d4 30582 1726855320.28412: WORKER PROCESS EXITING 30582 1726855320.28415: Calling groups_plugins_play to load vars for managed_node3 30582 1726855320.29366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855320.30240: done with get_vars() 30582 1726855320.30259: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:00 -0400 (0:00:00.038) 0:00:56.653 ****** 30582 1726855320.30334: entering _queue_task() for managed_node3/service_facts 30582 1726855320.30605: worker is 1 (out of 1 available) 30582 1726855320.30620: exiting _queue_task() for managed_node3/service_facts 30582 1726855320.30634: done queuing things up, now waiting for results queue to drain 30582 1726855320.30636: waiting for pending results... 
30582 1726855320.30833: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855320.30944: in run() - task 0affcc66-ac2b-aa83-7d57-0000000012d6 30582 1726855320.30955: variable 'ansible_search_path' from source: unknown 30582 1726855320.30959: variable 'ansible_search_path' from source: unknown 30582 1726855320.30994: calling self._execute() 30582 1726855320.31061: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.31066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.31078: variable 'omit' from source: magic vars 30582 1726855320.31360: variable 'ansible_distribution_major_version' from source: facts 30582 1726855320.31365: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855320.31372: variable 'omit' from source: magic vars 30582 1726855320.31429: variable 'omit' from source: magic vars 30582 1726855320.31452: variable 'omit' from source: magic vars 30582 1726855320.31489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855320.31518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855320.31535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855320.31548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855320.31559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855320.31586: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855320.31591: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.31593: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855320.31665: Set connection var ansible_timeout to 10 30582 1726855320.31668: Set connection var ansible_connection to ssh 30582 1726855320.31676: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855320.31680: Set connection var ansible_pipelining to False 30582 1726855320.31690: Set connection var ansible_shell_executable to /bin/sh 30582 1726855320.31693: Set connection var ansible_shell_type to sh 30582 1726855320.31707: variable 'ansible_shell_executable' from source: unknown 30582 1726855320.31709: variable 'ansible_connection' from source: unknown 30582 1726855320.31712: variable 'ansible_module_compression' from source: unknown 30582 1726855320.31714: variable 'ansible_shell_type' from source: unknown 30582 1726855320.31716: variable 'ansible_shell_executable' from source: unknown 30582 1726855320.31719: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855320.31723: variable 'ansible_pipelining' from source: unknown 30582 1726855320.31725: variable 'ansible_timeout' from source: unknown 30582 1726855320.31730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855320.31879: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855320.31888: variable 'omit' from source: magic vars 30582 1726855320.31895: starting attempt loop 30582 1726855320.31900: running the handler 30582 1726855320.31911: _low_level_execute_command(): starting 30582 1726855320.31917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855320.32433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855320.32436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.32439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855320.32441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.32501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855320.32505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855320.32508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855320.32585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855320.34289: stdout chunk (state=3): >>>/root <<< 30582 1726855320.34381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855320.34415: stderr chunk (state=3): >>><<< 30582 1726855320.34419: stdout chunk (state=3): >>><<< 30582 1726855320.34443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855320.34456: _low_level_execute_command(): starting 30582 1726855320.34463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261 `" && echo ansible-tmp-1726855320.3444417-33306-216152713365261="` echo /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261 `" ) && sleep 0' 30582 1726855320.34923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855320.34928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855320.34931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.34941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855320.34943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.34991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855320.34995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855320.35059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855320.36958: stdout chunk (state=3): >>>ansible-tmp-1726855320.3444417-33306-216152713365261=/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261 <<< 30582 1726855320.37062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855320.37096: stderr chunk (state=3): >>><<< 30582 1726855320.37099: stdout chunk (state=3): >>><<< 30582 1726855320.37116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855320.3444417-33306-216152713365261=/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855320.37155: variable 'ansible_module_compression' from source: unknown 30582 1726855320.37195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855320.37231: variable 'ansible_facts' from source: unknown 30582 1726855320.37288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py 30582 1726855320.37393: Sending initial data 30582 1726855320.37396: Sent initial data (162 bytes) 30582 1726855320.37853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855320.37857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855320.37860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.37862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855320.37864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.37920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855320.37923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855320.37925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855320.37986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855320.39554: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855320.39562: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855320.39611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855320.39670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9w12xsco /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py <<< 30582 1726855320.39674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py" <<< 30582 1726855320.39728: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9w12xsco" to remote "/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py" <<< 30582 1726855320.39732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py" <<< 30582 1726855320.40350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855320.40397: stderr chunk (state=3): >>><<< 30582 1726855320.40400: stdout chunk (state=3): >>><<< 30582 1726855320.40416: done transferring module to remote 30582 1726855320.40425: _low_level_execute_command(): starting 30582 1726855320.40429: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/ /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py && sleep 0' 30582 1726855320.40861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855320.40868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855320.40896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.40899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855320.40920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855320.40923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.40962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855320.40965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855320.40968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855320.41031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855320.42860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855320.42889: stderr chunk (state=3): >>><<< 30582 1726855320.42893: stdout chunk (state=3): >>><<< 30582 1726855320.42905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855320.42907: _low_level_execute_command(): starting 30582 1726855320.42913: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/AnsiballZ_service_facts.py && sleep 0' 30582 1726855320.43366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855320.43369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.43376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855320.43378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855320.43430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855320.43434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855320.43439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855320.43505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855321.95759: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30582 1726855321.95776: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855321.95865: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855321.97294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855321.97340: stderr chunk (state=3): >>><<< 30582 1726855321.97343: stdout chunk (state=3): >>><<< 30582 1726855321.97376: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855321.98194: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855321.98197: _low_level_execute_command(): starting 30582 1726855321.98200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855320.3444417-33306-216152713365261/ > /dev/null 2>&1 && sleep 0' 30582 1726855321.98780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855321.98803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855321.98817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855321.98835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855321.98855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855321.98866: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855321.98880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855321.98902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855321.98993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855321.99007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855321.99106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.01395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.01399: stdout chunk (state=3): >>><<< 30582 1726855322.01402: stderr chunk (state=3): >>><<< 30582 1726855322.01406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855322.01409: handler run complete 30582 1726855322.01663: variable 'ansible_facts' from source: unknown 30582 1726855322.02026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.02996: variable 'ansible_facts' from source: unknown 30582 1726855322.03229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.03504: attempt loop complete, returning result 30582 1726855322.03520: _execute() done 30582 1726855322.03534: dumping result to json 30582 1726855322.03606: done dumping result, returning 30582 1726855322.03625: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-0000000012d6] 30582 1726855322.03635: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d6 30582 1726855322.05765: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d6 30582 1726855322.05768: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855322.06043: no more pending results, returning what we have 30582 1726855322.06046: results queue empty 30582 1726855322.06047: checking for any_errors_fatal 30582 1726855322.06052: done checking for any_errors_fatal 30582 1726855322.06053: checking for max_fail_percentage 30582 1726855322.06054: done checking for max_fail_percentage 30582 1726855322.06055: checking to see if all hosts have failed and the running result is not ok 30582 1726855322.06056: done checking to see if all hosts have failed 30582 1726855322.06057: getting the remaining hosts for this loop 30582 
1726855322.06058: done getting the remaining hosts for this loop 30582 1726855322.06062: getting the next task for host managed_node3 30582 1726855322.06068: done getting next task for host managed_node3 30582 1726855322.06075: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855322.06081: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855322.06095: getting variables 30582 1726855322.06097: in VariableManager get_vars() 30582 1726855322.06127: Calling all_inventory to load vars for managed_node3 30582 1726855322.06130: Calling groups_inventory to load vars for managed_node3 30582 1726855322.06132: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855322.06140: Calling all_plugins_play to load vars for managed_node3 30582 1726855322.06148: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855322.06151: Calling groups_plugins_play to load vars for managed_node3 30582 1726855322.09216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.10817: done with get_vars() 30582 1726855322.10850: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:02 -0400 (0:00:01.806) 0:00:58.459 ****** 30582 1726855322.10955: entering _queue_task() for managed_node3/package_facts 30582 1726855322.11325: worker is 1 (out of 1 available) 30582 1726855322.11337: exiting _queue_task() for managed_node3/package_facts 30582 1726855322.11350: done queuing things up, now waiting for results queue to drain 30582 1726855322.11351: waiting for pending results... 
30582 1726855322.11651: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855322.11836: in run() - task 0affcc66-ac2b-aa83-7d57-0000000012d7 30582 1726855322.11859: variable 'ansible_search_path' from source: unknown 30582 1726855322.11867: variable 'ansible_search_path' from source: unknown 30582 1726855322.11909: calling self._execute() 30582 1726855322.12000: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855322.12010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855322.12026: variable 'omit' from source: magic vars 30582 1726855322.12938: variable 'ansible_distribution_major_version' from source: facts 30582 1726855322.12942: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855322.12944: variable 'omit' from source: magic vars 30582 1726855322.12946: variable 'omit' from source: magic vars 30582 1726855322.12948: variable 'omit' from source: magic vars 30582 1726855322.13074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855322.13193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855322.13220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855322.13284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855322.13305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855322.13410: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855322.13592: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855322.13596: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855322.13660: Set connection var ansible_timeout to 10 30582 1726855322.13668: Set connection var ansible_connection to ssh 30582 1726855322.13711: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855322.13722: Set connection var ansible_pipelining to False 30582 1726855322.13733: Set connection var ansible_shell_executable to /bin/sh 30582 1726855322.13918: Set connection var ansible_shell_type to sh 30582 1726855322.13920: variable 'ansible_shell_executable' from source: unknown 30582 1726855322.13923: variable 'ansible_connection' from source: unknown 30582 1726855322.13925: variable 'ansible_module_compression' from source: unknown 30582 1726855322.13927: variable 'ansible_shell_type' from source: unknown 30582 1726855322.13929: variable 'ansible_shell_executable' from source: unknown 30582 1726855322.13931: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855322.13933: variable 'ansible_pipelining' from source: unknown 30582 1726855322.13934: variable 'ansible_timeout' from source: unknown 30582 1726855322.13936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855322.14219: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855322.14365: variable 'omit' from source: magic vars 30582 1726855322.14376: starting attempt loop 30582 1726855322.14382: running the handler 30582 1726855322.14401: _low_level_execute_command(): starting 30582 1726855322.14413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855322.15828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855322.15879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855322.15897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855322.15919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855322.15938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855322.15951: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855322.16065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.16305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855322.16453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.18165: stdout chunk (state=3): >>>/root <<< 30582 1726855322.18305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.18318: stdout chunk (state=3): >>><<< 30582 1726855322.18330: stderr chunk (state=3): >>><<< 30582 1726855322.18367: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855322.18393: _low_level_execute_command(): starting 30582 1726855322.18405: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050 `" && echo ansible-tmp-1726855322.1837735-33367-185876411909050="` echo /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050 `" ) && sleep 0' 30582 1726855322.19077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855322.19205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855322.19314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855322.19383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.19409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855322.19557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.21480: stdout chunk (state=3): >>>ansible-tmp-1726855322.1837735-33367-185876411909050=/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050 <<< 30582 1726855322.21654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.21658: stdout chunk (state=3): >>><<< 30582 1726855322.21661: stderr chunk (state=3): >>><<< 30582 1726855322.21689: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855322.1837735-33367-185876411909050=/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855322.21745: variable 'ansible_module_compression' from source: unknown 30582 1726855322.21811: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855322.21994: variable 'ansible_facts' from source: unknown 30582 1726855322.22083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py 30582 1726855322.22601: Sending initial data 30582 1726855322.22611: Sent initial data (162 bytes) 30582 1726855322.23628: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855322.23633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855322.23669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855322.23838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.23854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855322.23958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.25622: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855322.25644: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855322.25717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855322.25801: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppgw4sudf /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py <<< 30582 1726855322.25804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py" <<< 30582 1726855322.25878: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppgw4sudf" to remote "/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py" <<< 30582 1726855322.27694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.27818: stderr chunk (state=3): >>><<< 30582 1726855322.27821: stdout chunk (state=3): >>><<< 30582 1726855322.27824: done transferring module to remote 30582 1726855322.27826: _low_level_execute_command(): starting 30582 1726855322.27828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/ /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py && sleep 0' 30582 1726855322.28595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855322.28600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855322.28603: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855322.28607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855322.28614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.28616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855322.28794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.30651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.30709: stderr chunk (state=3): >>><<< 30582 1726855322.30719: stdout chunk (state=3): >>><<< 30582 1726855322.30738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855322.30741: _low_level_execute_command(): starting 30582 1726855322.30746: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/AnsiballZ_package_facts.py && sleep 0' 30582 1726855322.31385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855322.31507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855322.31617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855322.31628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.31691: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855322.31786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.76012: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30582 1726855322.76196: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": 
"memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": 
"8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855322.76210: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", 
"version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", 
"version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855322.76248: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": 
"5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 
10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855322.78035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.78076: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855322.78079: stdout chunk (state=3): >>><<< 30582 1726855322.78081: stderr chunk (state=3): >>><<< 30582 1726855322.78300: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": 
[{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": 
"1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", 
"release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855322.80468: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855322.80515: _low_level_execute_command(): starting 30582 1726855322.80527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855322.1837735-33367-185876411909050/ > /dev/null 2>&1 && sleep 0' 30582 1726855322.81213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855322.81232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855322.81247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855322.81286: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855322.81394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855322.81419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855322.81525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855322.83449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855322.83453: stdout chunk (state=3): >>><<< 30582 1726855322.83456: stderr chunk (state=3): >>><<< 30582 1726855322.83593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855322.83597: handler run complete 30582 1726855322.84335: variable 'ansible_facts' from source: unknown 30582 1726855322.84809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.86844: variable 'ansible_facts' from source: unknown 30582 1726855322.87517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.88911: attempt loop complete, returning result 30582 1726855322.89293: _execute() done 30582 1726855322.89297: dumping result to json 30582 1726855322.89531: done dumping result, returning 30582 1726855322.89695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-0000000012d7] 30582 1726855322.89699: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d7 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855322.93875: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000012d7 30582 1726855322.93881: WORKER PROCESS EXITING 30582 1726855322.93906: no more pending results, returning what we have 30582 1726855322.93909: results queue empty 30582 1726855322.93909: checking for any_errors_fatal 30582 1726855322.93913: done checking 
for any_errors_fatal 30582 1726855322.93914: checking for max_fail_percentage 30582 1726855322.93915: done checking for max_fail_percentage 30582 1726855322.93916: checking to see if all hosts have failed and the running result is not ok 30582 1726855322.93916: done checking to see if all hosts have failed 30582 1726855322.93917: getting the remaining hosts for this loop 30582 1726855322.93918: done getting the remaining hosts for this loop 30582 1726855322.93920: getting the next task for host managed_node3 30582 1726855322.93926: done getting next task for host managed_node3 30582 1726855322.93929: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855322.93932: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855322.93943: getting variables 30582 1726855322.93944: in VariableManager get_vars() 30582 1726855322.93968: Calling all_inventory to load vars for managed_node3 30582 1726855322.93970: Calling groups_inventory to load vars for managed_node3 30582 1726855322.93972: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855322.93979: Calling all_plugins_play to load vars for managed_node3 30582 1726855322.93981: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855322.93983: Calling groups_plugins_play to load vars for managed_node3 30582 1726855322.94737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855322.95701: done with get_vars() 30582 1726855322.95724: done getting variables 30582 1726855322.95779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:02 -0400 (0:00:00.848) 0:00:59.308 ****** 30582 1726855322.95831: entering _queue_task() for managed_node3/debug 30582 1726855322.96177: worker is 1 (out of 1 available) 30582 1726855322.96195: exiting _queue_task() for managed_node3/debug 30582 1726855322.96206: done queuing things up, now waiting for results queue to drain 30582 1726855322.96207: waiting for pending results... 
30582 1726855322.96508: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855322.96705: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127b 30582 1726855322.96709: variable 'ansible_search_path' from source: unknown 30582 1726855322.96712: variable 'ansible_search_path' from source: unknown 30582 1726855322.96714: calling self._execute() 30582 1726855322.96757: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855322.96767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855322.96780: variable 'omit' from source: magic vars 30582 1726855322.97160: variable 'ansible_distribution_major_version' from source: facts 30582 1726855322.97177: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855322.97191: variable 'omit' from source: magic vars 30582 1726855322.97262: variable 'omit' from source: magic vars 30582 1726855322.97358: variable 'network_provider' from source: set_fact 30582 1726855322.97380: variable 'omit' from source: magic vars 30582 1726855322.97424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855322.97470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855322.97497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855322.97517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855322.97534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855322.97569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855322.97581: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855322.97686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855322.97705: Set connection var ansible_timeout to 10 30582 1726855322.97712: Set connection var ansible_connection to ssh 30582 1726855322.97722: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855322.97731: Set connection var ansible_pipelining to False 30582 1726855322.97741: Set connection var ansible_shell_executable to /bin/sh 30582 1726855322.97748: Set connection var ansible_shell_type to sh 30582 1726855322.97773: variable 'ansible_shell_executable' from source: unknown 30582 1726855322.97781: variable 'ansible_connection' from source: unknown 30582 1726855322.97793: variable 'ansible_module_compression' from source: unknown 30582 1726855322.97800: variable 'ansible_shell_type' from source: unknown 30582 1726855322.97805: variable 'ansible_shell_executable' from source: unknown 30582 1726855322.97812: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855322.97819: variable 'ansible_pipelining' from source: unknown 30582 1726855322.97824: variable 'ansible_timeout' from source: unknown 30582 1726855322.97830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855322.97979: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855322.97997: variable 'omit' from source: magic vars 30582 1726855322.98009: starting attempt loop 30582 1726855322.98017: running the handler 30582 1726855322.98069: handler run complete 30582 1726855322.98091: attempt loop complete, returning result 30582 1726855322.98098: _execute() done 30582 1726855322.98117: dumping result to json 30582 1726855322.98119: done dumping result, returning 
30582 1726855322.98124: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-00000000127b] 30582 1726855322.98192: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127b 30582 1726855322.98492: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127b 30582 1726855322.98496: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855322.98551: no more pending results, returning what we have 30582 1726855322.98554: results queue empty 30582 1726855322.98555: checking for any_errors_fatal 30582 1726855322.98562: done checking for any_errors_fatal 30582 1726855322.98563: checking for max_fail_percentage 30582 1726855322.98565: done checking for max_fail_percentage 30582 1726855322.98566: checking to see if all hosts have failed and the running result is not ok 30582 1726855322.98566: done checking to see if all hosts have failed 30582 1726855322.98567: getting the remaining hosts for this loop 30582 1726855322.98568: done getting the remaining hosts for this loop 30582 1726855322.98572: getting the next task for host managed_node3 30582 1726855322.98579: done getting next task for host managed_node3 30582 1726855322.98583: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855322.98589: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855322.98601: getting variables 30582 1726855322.98603: in VariableManager get_vars() 30582 1726855322.98637: Calling all_inventory to load vars for managed_node3 30582 1726855322.98640: Calling groups_inventory to load vars for managed_node3 30582 1726855322.98642: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855322.98650: Calling all_plugins_play to load vars for managed_node3 30582 1726855322.98653: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855322.98656: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.00017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.01601: done with get_vars() 30582 1726855323.01627: done getting variables 30582 1726855323.01683: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:03 -0400 (0:00:00.058) 0:00:59.367 ****** 30582 1726855323.01728: entering _queue_task() for managed_node3/fail 30582 1726855323.02090: worker is 1 (out of 1 available) 30582 1726855323.02107: exiting _queue_task() for managed_node3/fail 30582 1726855323.02120: done queuing things up, now waiting for results queue to drain 30582 1726855323.02122: waiting for pending results... 30582 1726855323.02455: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855323.02633: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127c 30582 1726855323.02656: variable 'ansible_search_path' from source: unknown 30582 1726855323.02667: variable 'ansible_search_path' from source: unknown 30582 1726855323.02717: calling self._execute() 30582 1726855323.02828: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.02840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.02855: variable 'omit' from source: magic vars 30582 1726855323.03278: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.03297: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.03434: variable 'network_state' from source: role '' defaults 30582 1726855323.03454: Evaluated conditional (network_state != {}): False 30582 1726855323.03463: when evaluation is False, skipping this task 30582 1726855323.03470: _execute() done 30582 1726855323.03478: dumping result to json 30582 1726855323.03484: done dumping result, returning 30582 1726855323.03497: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-00000000127c] 30582 1726855323.03507: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127c 30582 1726855323.03628: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127c 30582 1726855323.03632: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855323.03711: no more pending results, returning what we have 30582 1726855323.03715: results queue empty 30582 1726855323.03717: checking for any_errors_fatal 30582 1726855323.03723: done checking for any_errors_fatal 30582 1726855323.03723: checking for max_fail_percentage 30582 1726855323.03726: done checking for max_fail_percentage 30582 1726855323.03727: checking to see if all hosts have failed and the running result is not ok 30582 1726855323.03727: done checking to see if all hosts have failed 30582 1726855323.03728: getting the remaining hosts for this loop 30582 1726855323.03729: done getting the remaining hosts for this loop 30582 1726855323.03733: getting the next task for host managed_node3 30582 1726855323.03743: done getting next task for host managed_node3 30582 1726855323.03747: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855323.03752: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855323.03782: getting variables 30582 1726855323.03785: in VariableManager get_vars() 30582 1726855323.03829: Calling all_inventory to load vars for managed_node3 30582 1726855323.03832: Calling groups_inventory to load vars for managed_node3 30582 1726855323.03835: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855323.03848: Calling all_plugins_play to load vars for managed_node3 30582 1726855323.03851: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855323.03855: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.05541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.06926: done with get_vars() 30582 1726855323.06958: done getting variables 30582 1726855323.07020: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:03 -0400 (0:00:00.053) 0:00:59.420 ****** 30582 1726855323.07054: entering _queue_task() for managed_node3/fail 30582 1726855323.07419: worker is 1 (out of 1 available) 30582 1726855323.07432: exiting _queue_task() for managed_node3/fail 30582 1726855323.07444: done queuing things up, now waiting for results queue to drain 30582 1726855323.07445: waiting for pending results... 30582 1726855323.07814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855323.07911: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127d 30582 1726855323.07916: variable 'ansible_search_path' from source: unknown 30582 1726855323.07919: variable 'ansible_search_path' from source: unknown 30582 1726855323.07945: calling self._execute() 30582 1726855323.08049: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.08127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.08131: variable 'omit' from source: magic vars 30582 1726855323.08462: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.08481: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.08895: variable 'network_state' from source: role '' defaults 30582 1726855323.08899: Evaluated conditional (network_state != {}): False 30582 1726855323.08901: when evaluation is False, skipping this task 30582 1726855323.08903: _execute() done 30582 1726855323.08906: dumping result to json 30582 1726855323.08908: done dumping result, returning 30582 1726855323.08911: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-00000000127d] 30582 1726855323.08913: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127d 30582 1726855323.08984: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127d 30582 1726855323.08990: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855323.09199: no more pending results, returning what we have 30582 1726855323.09203: results queue empty 30582 1726855323.09204: checking for any_errors_fatal 30582 1726855323.09215: done checking for any_errors_fatal 30582 1726855323.09215: checking for max_fail_percentage 30582 1726855323.09218: done checking for max_fail_percentage 30582 1726855323.09219: checking to see if all hosts have failed and the running result is not ok 30582 1726855323.09220: done checking to see if all hosts have failed 30582 1726855323.09221: getting the remaining hosts for this loop 30582 1726855323.09223: done getting the remaining hosts for this loop 30582 1726855323.09228: getting the next task for host managed_node3 30582 1726855323.09240: done getting next task for host managed_node3 30582 1726855323.09244: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855323.09252: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855323.09284: getting variables 30582 1726855323.09288: in VariableManager get_vars() 30582 1726855323.09331: Calling all_inventory to load vars for managed_node3 30582 1726855323.09335: Calling groups_inventory to load vars for managed_node3 30582 1726855323.09337: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855323.09350: Calling all_plugins_play to load vars for managed_node3 30582 1726855323.09352: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855323.09355: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.10924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.13951: done with get_vars() 30582 1726855323.14296: done getting variables 30582 1726855323.14362: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:03 -0400 (0:00:00.075) 0:00:59.495 ****** 30582 1726855323.14604: entering _queue_task() for managed_node3/fail 30582 1726855323.15298: worker is 1 (out of 1 available) 30582 1726855323.15311: exiting _queue_task() for managed_node3/fail 30582 1726855323.15323: done queuing things up, now waiting for results queue to drain 30582 1726855323.15326: waiting for pending results... 30582 1726855323.16005: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855323.16011: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127e 30582 1726855323.16394: variable 'ansible_search_path' from source: unknown 30582 1726855323.16397: variable 'ansible_search_path' from source: unknown 30582 1726855323.16400: calling self._execute() 30582 1726855323.16403: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.16405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.16408: variable 'omit' from source: magic vars 30582 1726855323.17117: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.17393: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.17473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855323.22659: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855323.22942: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855323.22981: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855323.23142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855323.23168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855323.23493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.23497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.23499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.23514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.23528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.23780: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.23799: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855323.24163: variable 'ansible_distribution' from source: facts 30582 1726855323.24167: variable '__network_rh_distros' from source: role '' defaults 30582 1726855323.24179: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855323.24734: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.24761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.24786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.24948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.24962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.25096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.25125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.25151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855323.25312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.25476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.25501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.25526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.25751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.25766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.26183: variable 'network_connections' from source: include params 30582 1726855323.26217: variable 'interface' from source: play vars 30582 1726855323.26277: variable 'interface' from source: play vars 30582 1726855323.26291: variable 'network_state' from source: role '' defaults 30582 1726855323.26593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855323.26596: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855323.26791: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855323.26795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855323.26798: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855323.26800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855323.26802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855323.26812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.26815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855323.26825: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855323.26828: when evaluation is False, skipping this task 30582 1726855323.26830: _execute() done 30582 1726855323.26833: dumping result to json 30582 1726855323.26835: done dumping result, returning 30582 1726855323.26994: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-00000000127e] 30582 1726855323.26998: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000127e skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855323.27116: no more pending results, returning what we have 30582 1726855323.27120: results queue empty 30582 1726855323.27122: checking for any_errors_fatal 30582 1726855323.27127: done checking for any_errors_fatal 30582 1726855323.27128: checking for max_fail_percentage 30582 1726855323.27130: done checking for max_fail_percentage 30582 1726855323.27132: checking to see if all hosts have failed and the running result is not ok 30582 1726855323.27132: done checking to see if all hosts have failed 30582 1726855323.27133: getting the remaining hosts for this loop 30582 1726855323.27135: done getting the remaining hosts for this loop 30582 1726855323.27140: getting the next task for host managed_node3 30582 1726855323.27151: done getting next task for host managed_node3 30582 1726855323.27156: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855323.27161: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855323.27193: getting variables 30582 1726855323.27195: in VariableManager get_vars() 30582 1726855323.27235: Calling all_inventory to load vars for managed_node3 30582 1726855323.27237: Calling groups_inventory to load vars for managed_node3 30582 1726855323.27239: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855323.27250: Calling all_plugins_play to load vars for managed_node3 30582 1726855323.27253: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855323.27256: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.27797: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127e 30582 1726855323.27801: WORKER PROCESS EXITING 30582 1726855323.31011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.34259: done with get_vars() 30582 1726855323.34303: done getting variables 30582 1726855323.34367: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:03 -0400 (0:00:00.200) 0:00:59.695 ****** 30582 1726855323.34610: entering _queue_task() for managed_node3/dnf 30582 1726855323.35395: worker is 1 (out of 1 available) 30582 1726855323.35407: exiting _queue_task() for managed_node3/dnf 30582 1726855323.35418: done queuing things up, now waiting for results queue to drain 30582 1726855323.35420: waiting for pending results... 30582 1726855323.36507: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855323.36731: in run() - task 0affcc66-ac2b-aa83-7d57-00000000127f 30582 1726855323.36754: variable 'ansible_search_path' from source: unknown 30582 1726855323.36764: variable 'ansible_search_path' from source: unknown 30582 1726855323.37093: calling self._execute() 30582 1726855323.37493: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.37497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.37500: variable 'omit' from source: magic vars 30582 1726855323.38531: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.38548: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.39152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855323.44019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855323.44268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855323.44314: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855323.44354: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855323.44792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855323.44796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.44799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.44802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.44813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.44833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.44964: variable 'ansible_distribution' from source: facts 30582 1726855323.45199: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.45222: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855323.45340: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855323.45719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.45747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.45777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.45823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.46092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.46095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.46098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.46100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.46138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.46156: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.46201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.46492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.46496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.46498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.46500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.46653: variable 'network_connections' from source: include params 30582 1726855323.46907: variable 'interface' from source: play vars 30582 1726855323.46974: variable 'interface' from source: play vars 30582 1726855323.47051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855323.47467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855323.47509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855323.47544: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855323.47577: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855323.47639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855323.47666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855323.47707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.47736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855323.47797: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855323.48037: variable 'network_connections' from source: include params 30582 1726855323.48048: variable 'interface' from source: play vars 30582 1726855323.48111: variable 'interface' from source: play vars 30582 1726855323.48140: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855323.48148: when evaluation is False, skipping this task 30582 1726855323.48156: _execute() done 30582 1726855323.48163: dumping result to json 30582 1726855323.48169: done dumping result, returning 30582 1726855323.48180: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000127f] 30582 
1726855323.48190: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127f 30582 1726855323.48299: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000127f skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855323.48355: no more pending results, returning what we have 30582 1726855323.48359: results queue empty 30582 1726855323.48360: checking for any_errors_fatal 30582 1726855323.48367: done checking for any_errors_fatal 30582 1726855323.48368: checking for max_fail_percentage 30582 1726855323.48372: done checking for max_fail_percentage 30582 1726855323.48373: checking to see if all hosts have failed and the running result is not ok 30582 1726855323.48374: done checking to see if all hosts have failed 30582 1726855323.48375: getting the remaining hosts for this loop 30582 1726855323.48376: done getting the remaining hosts for this loop 30582 1726855323.48382: getting the next task for host managed_node3 30582 1726855323.48392: done getting next task for host managed_node3 30582 1726855323.48396: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855323.48401: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855323.48429: getting variables 30582 1726855323.48430: in VariableManager get_vars() 30582 1726855323.48472: Calling all_inventory to load vars for managed_node3 30582 1726855323.48475: Calling groups_inventory to load vars for managed_node3 30582 1726855323.48477: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855323.48791: Calling all_plugins_play to load vars for managed_node3 30582 1726855323.48795: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855323.48800: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.49358: WORKER PROCESS EXITING 30582 1726855323.51600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.55224: done with get_vars() 30582 1726855323.55251: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855323.55335: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through 
the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:03 -0400 (0:00:00.207) 0:00:59.903 ****** 30582 1726855323.55374: entering _queue_task() for managed_node3/yum 30582 1726855323.56297: worker is 1 (out of 1 available) 30582 1726855323.56308: exiting _queue_task() for managed_node3/yum 30582 1726855323.56317: done queuing things up, now waiting for results queue to drain 30582 1726855323.56318: waiting for pending results... 30582 1726855323.56607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855323.56613: in run() - task 0affcc66-ac2b-aa83-7d57-000000001280 30582 1726855323.56616: variable 'ansible_search_path' from source: unknown 30582 1726855323.56620: variable 'ansible_search_path' from source: unknown 30582 1726855323.56686: calling self._execute() 30582 1726855323.56777: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.56781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.56809: variable 'omit' from source: magic vars 30582 1726855323.57213: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.57393: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.57437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855323.70626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855323.70693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855323.70728: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855323.70759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855323.70807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855323.70868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.70906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.70928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.71041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.71044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.71077: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.71093: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855323.71097: when evaluation is False, skipping this task 30582 1726855323.71104: _execute() done 30582 1726855323.71107: dumping result to json 30582 1726855323.71109: done dumping result, returning 30582 1726855323.71118: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001280] 30582 1726855323.71121: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001280 30582 1726855323.71351: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001280 30582 1726855323.71353: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855323.71447: no more pending results, returning what we have 30582 1726855323.71451: results queue empty 30582 1726855323.71452: checking for any_errors_fatal 30582 1726855323.71457: done checking for any_errors_fatal 30582 1726855323.71458: checking for max_fail_percentage 30582 1726855323.71460: done checking for max_fail_percentage 30582 1726855323.71461: checking to see if all hosts have failed and the running result is not ok 30582 1726855323.71462: done checking to see if all hosts have failed 30582 1726855323.71462: getting the remaining hosts for this loop 30582 1726855323.71464: done getting the remaining hosts for this loop 30582 1726855323.71467: getting the next task for host managed_node3 30582 1726855323.71474: done getting next task for host managed_node3 30582 1726855323.71478: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855323.71483: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855323.71505: getting variables 30582 1726855323.71506: in VariableManager get_vars() 30582 1726855323.71540: Calling all_inventory to load vars for managed_node3 30582 1726855323.71543: Calling groups_inventory to load vars for managed_node3 30582 1726855323.71544: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855323.71552: Calling all_plugins_play to load vars for managed_node3 30582 1726855323.71555: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855323.71557: Calling groups_plugins_play to load vars for managed_node3 30582 1726855323.87391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855323.91599: done with get_vars() 30582 1726855323.91634: done getting variables 30582 1726855323.91685: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:03 -0400 (0:00:00.363) 0:01:00.267 ****** 30582 1726855323.91720: entering _queue_task() for managed_node3/fail 30582 1726855323.92476: worker is 1 (out of 1 available) 30582 1726855323.92491: exiting _queue_task() for managed_node3/fail 30582 1726855323.92503: done queuing things up, now waiting for results queue to drain 30582 1726855323.92505: waiting for pending results... 30582 1726855323.93207: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855323.93394: in run() - task 0affcc66-ac2b-aa83-7d57-000000001281 30582 1726855323.93399: variable 'ansible_search_path' from source: unknown 30582 1726855323.93403: variable 'ansible_search_path' from source: unknown 30582 1726855323.93695: calling self._execute() 30582 1726855323.93712: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855323.93725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855323.93739: variable 'omit' from source: magic vars 30582 1726855323.94475: variable 'ansible_distribution_major_version' from source: facts 30582 1726855323.94606: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855323.94779: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855323.95136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855323.98809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855323.98916: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855323.98976: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855323.99017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855323.99070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855323.99154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.99197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.99227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.99297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.99300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.99346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.99378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.99514: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.99518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.99520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.99523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855323.99552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855323.99583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855323.99635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855323.99656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855323.99852: variable 'network_connections' from source: include params 30582 1726855323.99871: variable 'interface' from source: play vars 30582 1726855323.99951: variable 'interface' from source: play vars 30582 1726855324.00028: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855324.00229: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855324.00278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855324.00315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855324.00346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855324.00398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855324.00424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855324.00496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.00499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855324.00539: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855324.00804: variable 'network_connections' from source: include params 30582 1726855324.00822: variable 'interface' from source: play vars 30582 1726855324.00884: variable 'interface' from source: play vars 30582 1726855324.00914: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855324.00992: when evaluation is False, skipping this task 30582 
1726855324.00998: _execute() done 30582 1726855324.01001: dumping result to json 30582 1726855324.01003: done dumping result, returning 30582 1726855324.01005: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001281] 30582 1726855324.01007: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001281 30582 1726855324.01096: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001281 30582 1726855324.01099: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855324.01157: no more pending results, returning what we have 30582 1726855324.01161: results queue empty 30582 1726855324.01162: checking for any_errors_fatal 30582 1726855324.01173: done checking for any_errors_fatal 30582 1726855324.01174: checking for max_fail_percentage 30582 1726855324.01176: done checking for max_fail_percentage 30582 1726855324.01177: checking to see if all hosts have failed and the running result is not ok 30582 1726855324.01178: done checking to see if all hosts have failed 30582 1726855324.01179: getting the remaining hosts for this loop 30582 1726855324.01180: done getting the remaining hosts for this loop 30582 1726855324.01185: getting the next task for host managed_node3 30582 1726855324.01196: done getting next task for host managed_node3 30582 1726855324.01201: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855324.01207: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855324.01232: getting variables 30582 1726855324.01233: in VariableManager get_vars() 30582 1726855324.01275: Calling all_inventory to load vars for managed_node3 30582 1726855324.01278: Calling groups_inventory to load vars for managed_node3 30582 1726855324.01281: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855324.01407: Calling all_plugins_play to load vars for managed_node3 30582 1726855324.01411: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855324.01415: Calling groups_plugins_play to load vars for managed_node3 30582 1726855324.03079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855324.04861: done with get_vars() 30582 1726855324.04893: done getting variables 30582 1726855324.04949: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:04 -0400 (0:00:00.132) 0:01:00.399 ****** 30582 1726855324.04990: entering _queue_task() for managed_node3/package 30582 1726855324.05431: worker is 1 (out of 1 available) 30582 1726855324.05443: exiting _queue_task() for managed_node3/package 30582 1726855324.05452: done queuing things up, now waiting for results queue to drain 30582 1726855324.05454: waiting for pending results... 30582 1726855324.05761: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855324.05858: in run() - task 0affcc66-ac2b-aa83-7d57-000000001282 30582 1726855324.05967: variable 'ansible_search_path' from source: unknown 30582 1726855324.05971: variable 'ansible_search_path' from source: unknown 30582 1726855324.05975: calling self._execute() 30582 1726855324.06035: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.06047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.06061: variable 'omit' from source: magic vars 30582 1726855324.06443: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.06459: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855324.06667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855324.06961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855324.07013: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855324.07055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855324.07135: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855324.07262: variable 'network_packages' from source: role '' defaults 30582 1726855324.07383: variable '__network_provider_setup' from source: role '' defaults 30582 1726855324.07402: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855324.07493: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855324.07496: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855324.07551: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855324.07750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855324.09952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855324.10060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855324.10064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855324.10093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855324.10121: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855324.10216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.10246: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.10281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.10325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.10344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.10495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.10498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.10501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.10510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.10529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855324.10768: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855324.10888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.10929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.10959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.11042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.11046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.11135: variable 'ansible_python' from source: facts 30582 1726855324.11164: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855324.11255: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855324.11364: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855324.11473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.11503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.11581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.11585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.11605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.11652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.11905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.11908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.11911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.11913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.12238: variable 'network_connections' from source: include params 
30582 1726855324.12395: variable 'interface' from source: play vars 30582 1726855324.12522: variable 'interface' from source: play vars 30582 1726855324.12675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855324.12709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855324.12809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.12858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855324.13101: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855324.13993: variable 'network_connections' from source: include params 30582 1726855324.13998: variable 'interface' from source: play vars 30582 1726855324.14240: variable 'interface' from source: play vars 30582 1726855324.14327: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855324.14880: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855324.15637: variable 'network_connections' from source: include params 30582 1726855324.15648: variable 'interface' from source: play vars 30582 1726855324.15828: variable 'interface' from source: play vars 30582 1726855324.15855: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855324.16246: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855324.16842: variable 'network_connections' 
from source: include params 30582 1726855324.16853: variable 'interface' from source: play vars 30582 1726855324.17023: variable 'interface' from source: play vars 30582 1726855324.17082: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855324.17178: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855324.17443: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855324.17447: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855324.17817: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855324.18880: variable 'network_connections' from source: include params 30582 1726855324.18895: variable 'interface' from source: play vars 30582 1726855324.18959: variable 'interface' from source: play vars 30582 1726855324.19086: variable 'ansible_distribution' from source: facts 30582 1726855324.19098: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.19107: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.19125: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855324.19592: variable 'ansible_distribution' from source: facts 30582 1726855324.19596: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.19598: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.19600: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855324.19804: variable 'ansible_distribution' from source: facts 30582 1726855324.20094: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.20097: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.20100: variable 'network_provider' from source: set_fact 30582 
1726855324.20102: variable 'ansible_facts' from source: unknown 30582 1726855324.21553: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855324.21562: when evaluation is False, skipping this task 30582 1726855324.21569: _execute() done 30582 1726855324.21625: dumping result to json 30582 1726855324.21633: done dumping result, returning 30582 1726855324.21646: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000001282] 30582 1726855324.21655: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001282 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855324.21816: no more pending results, returning what we have 30582 1726855324.21820: results queue empty 30582 1726855324.21821: checking for any_errors_fatal 30582 1726855324.21827: done checking for any_errors_fatal 30582 1726855324.21827: checking for max_fail_percentage 30582 1726855324.21830: done checking for max_fail_percentage 30582 1726855324.21831: checking to see if all hosts have failed and the running result is not ok 30582 1726855324.21831: done checking to see if all hosts have failed 30582 1726855324.21832: getting the remaining hosts for this loop 30582 1726855324.21833: done getting the remaining hosts for this loop 30582 1726855324.21838: getting the next task for host managed_node3 30582 1726855324.21848: done getting next task for host managed_node3 30582 1726855324.21852: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855324.21859: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855324.21885: getting variables 30582 1726855324.21889: in VariableManager get_vars() 30582 1726855324.22032: Calling all_inventory to load vars for managed_node3 30582 1726855324.22035: Calling groups_inventory to load vars for managed_node3 30582 1726855324.22042: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855324.22053: Calling all_plugins_play to load vars for managed_node3 30582 1726855324.22056: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855324.22059: Calling groups_plugins_play to load vars for managed_node3 30582 1726855324.23196: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001282 30582 1726855324.23199: WORKER PROCESS EXITING 30582 1726855324.25317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855324.28955: done with get_vars() 30582 1726855324.28993: done getting variables 30582 1726855324.29177: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:04 -0400 (0:00:00.242) 0:01:00.642 ****** 30582 1726855324.29220: entering _queue_task() for managed_node3/package 30582 1726855324.30155: worker is 1 (out of 1 available) 30582 1726855324.30167: exiting _queue_task() for managed_node3/package 30582 1726855324.30177: done queuing things up, now waiting for results queue to drain 30582 1726855324.30179: waiting for pending results... 30582 1726855324.30699: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855324.30961: in run() - task 0affcc66-ac2b-aa83-7d57-000000001283 30582 1726855324.31029: variable 'ansible_search_path' from source: unknown 30582 1726855324.31039: variable 'ansible_search_path' from source: unknown 30582 1726855324.31086: calling self._execute() 30582 1726855324.31329: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.31447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.31451: variable 'omit' from source: magic vars 30582 1726855324.32310: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.32336: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855324.32758: variable 'network_state' from source: role '' defaults 30582 1726855324.32763: Evaluated conditional (network_state != {}): False 30582 1726855324.32766: when evaluation 
is False, skipping this task 30582 1726855324.32769: _execute() done 30582 1726855324.32772: dumping result to json 30582 1726855324.32774: done dumping result, returning 30582 1726855324.32777: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001283] 30582 1726855324.32780: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001283 30582 1726855324.33060: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001283 30582 1726855324.33064: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855324.33240: no more pending results, returning what we have 30582 1726855324.33244: results queue empty 30582 1726855324.33246: checking for any_errors_fatal 30582 1726855324.33255: done checking for any_errors_fatal 30582 1726855324.33256: checking for max_fail_percentage 30582 1726855324.33258: done checking for max_fail_percentage 30582 1726855324.33259: checking to see if all hosts have failed and the running result is not ok 30582 1726855324.33260: done checking to see if all hosts have failed 30582 1726855324.33260: getting the remaining hosts for this loop 30582 1726855324.33263: done getting the remaining hosts for this loop 30582 1726855324.33267: getting the next task for host managed_node3 30582 1726855324.33277: done getting next task for host managed_node3 30582 1726855324.33281: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855324.33289: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855324.33320: getting variables 30582 1726855324.33323: in VariableManager get_vars() 30582 1726855324.33366: Calling all_inventory to load vars for managed_node3 30582 1726855324.33370: Calling groups_inventory to load vars for managed_node3 30582 1726855324.33372: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855324.33386: Calling all_plugins_play to load vars for managed_node3 30582 1726855324.33616: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855324.33621: Calling groups_plugins_play to load vars for managed_node3 30582 1726855324.36868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855324.40229: done with get_vars() 30582 1726855324.40264: done getting variables 30582 1726855324.40441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:04 -0400 (0:00:00.112) 0:01:00.754 ****** 30582 1726855324.40478: entering _queue_task() for managed_node3/package 30582 1726855324.41471: worker is 1 (out of 1 available) 30582 1726855324.41485: exiting _queue_task() for managed_node3/package 30582 1726855324.41498: done queuing things up, now waiting for results queue to drain 30582 1726855324.41500: waiting for pending results... 30582 1726855324.42025: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855324.42252: in run() - task 0affcc66-ac2b-aa83-7d57-000000001284 30582 1726855324.42459: variable 'ansible_search_path' from source: unknown 30582 1726855324.42463: variable 'ansible_search_path' from source: unknown 30582 1726855324.42465: calling self._execute() 30582 1726855324.42598: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.42685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.42703: variable 'omit' from source: magic vars 30582 1726855324.43545: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.43564: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855324.43814: variable 'network_state' from source: role '' defaults 30582 1726855324.43878: Evaluated conditional (network_state != {}): False 30582 1726855324.43889: when evaluation is False, skipping this task 30582 1726855324.44083: _execute() done 30582 1726855324.44088: dumping 
result to json 30582 1726855324.44091: done dumping result, returning 30582 1726855324.44094: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001284] 30582 1726855324.44096: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001284 30582 1726855324.44171: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001284 30582 1726855324.44174: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855324.44240: no more pending results, returning what we have 30582 1726855324.44244: results queue empty 30582 1726855324.44246: checking for any_errors_fatal 30582 1726855324.44254: done checking for any_errors_fatal 30582 1726855324.44255: checking for max_fail_percentage 30582 1726855324.44258: done checking for max_fail_percentage 30582 1726855324.44259: checking to see if all hosts have failed and the running result is not ok 30582 1726855324.44260: done checking to see if all hosts have failed 30582 1726855324.44261: getting the remaining hosts for this loop 30582 1726855324.44263: done getting the remaining hosts for this loop 30582 1726855324.44267: getting the next task for host managed_node3 30582 1726855324.44278: done getting next task for host managed_node3 30582 1726855324.44282: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855324.44290: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855324.44320: getting variables 30582 1726855324.44322: in VariableManager get_vars() 30582 1726855324.44367: Calling all_inventory to load vars for managed_node3 30582 1726855324.44370: Calling groups_inventory to load vars for managed_node3 30582 1726855324.44373: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855324.44810: Calling all_plugins_play to load vars for managed_node3 30582 1726855324.44816: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855324.44820: Calling groups_plugins_play to load vars for managed_node3 30582 1726855324.47894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855324.51384: done with get_vars() 30582 1726855324.51510: done getting variables 30582 1726855324.51579: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:04 -0400 (0:00:00.111) 0:01:00.866 ****** 30582 1726855324.51622: entering _queue_task() for managed_node3/service 30582 1726855324.52479: worker is 1 (out of 1 available) 30582 1726855324.52726: exiting _queue_task() for managed_node3/service 30582 1726855324.52739: done queuing things up, now waiting for results queue to drain 30582 1726855324.52741: waiting for pending results... 30582 1726855324.53394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855324.53694: in run() - task 0affcc66-ac2b-aa83-7d57-000000001285 30582 1726855324.53700: variable 'ansible_search_path' from source: unknown 30582 1726855324.53702: variable 'ansible_search_path' from source: unknown 30582 1726855324.53704: calling self._execute() 30582 1726855324.53869: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.53881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.53898: variable 'omit' from source: magic vars 30582 1726855324.54865: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.54869: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855324.55147: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855324.55564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855324.61695: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855324.61700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855324.61702: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855324.61704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855324.61706: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855324.61874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.62029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.62058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.62136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.62250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.62304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.62334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.62419: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.62671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.62675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.62677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.62679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.62891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.62895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.62897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.63237: variable 'network_connections' from source: include params 30582 1726855324.63258: variable 'interface' from source: play vars 30582 1726855324.63397: variable 'interface' from source: play vars 30582 1726855324.63621: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855324.63927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855324.64103: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855324.64138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855324.64305: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855324.64356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855324.64510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855324.64542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.64574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855324.64656: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855324.65252: variable 'network_connections' from source: include params 30582 1726855324.65368: variable 'interface' from source: play vars 30582 1726855324.65440: variable 'interface' from source: play vars 30582 1726855324.65492: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855324.65533: when evaluation is False, skipping this task 30582 
1726855324.65541: _execute() done 30582 1726855324.65548: dumping result to json 30582 1726855324.65554: done dumping result, returning 30582 1726855324.65596: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001285] 30582 1726855324.65606: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001285 30582 1726855324.66019: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001285 30582 1726855324.66030: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855324.66080: no more pending results, returning what we have 30582 1726855324.66084: results queue empty 30582 1726855324.66085: checking for any_errors_fatal 30582 1726855324.66094: done checking for any_errors_fatal 30582 1726855324.66095: checking for max_fail_percentage 30582 1726855324.66099: done checking for max_fail_percentage 30582 1726855324.66100: checking to see if all hosts have failed and the running result is not ok 30582 1726855324.66101: done checking to see if all hosts have failed 30582 1726855324.66101: getting the remaining hosts for this loop 30582 1726855324.66103: done getting the remaining hosts for this loop 30582 1726855324.66107: getting the next task for host managed_node3 30582 1726855324.66116: done getting next task for host managed_node3 30582 1726855324.66121: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855324.66126: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855324.66152: getting variables 30582 1726855324.66154: in VariableManager get_vars() 30582 1726855324.66315: Calling all_inventory to load vars for managed_node3 30582 1726855324.66319: Calling groups_inventory to load vars for managed_node3 30582 1726855324.66321: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855324.66332: Calling all_plugins_play to load vars for managed_node3 30582 1726855324.66336: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855324.66339: Calling groups_plugins_play to load vars for managed_node3 30582 1726855324.71259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855324.76845: done with get_vars() 30582 1726855324.76890: done getting variables 30582 1726855324.77139: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:04 -0400 (0:00:00.255) 0:01:01.121 ****** 30582 1726855324.77178: entering _queue_task() for managed_node3/service 30582 1726855324.78654: worker is 1 (out of 1 available) 30582 1726855324.78666: exiting _queue_task() for managed_node3/service 30582 1726855324.78678: done queuing things up, now waiting for results queue to drain 30582 1726855324.78680: waiting for pending results... 30582 1726855324.79206: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855324.79249: in run() - task 0affcc66-ac2b-aa83-7d57-000000001286 30582 1726855324.79272: variable 'ansible_search_path' from source: unknown 30582 1726855324.79282: variable 'ansible_search_path' from source: unknown 30582 1726855324.79325: calling self._execute() 30582 1726855324.79596: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.79611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.79628: variable 'omit' from source: magic vars 30582 1726855324.80410: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.80429: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855324.80993: variable 'network_provider' from source: set_fact 30582 1726855324.80997: variable 'network_state' from source: role '' defaults 30582 1726855324.81000: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855324.81003: variable 'omit' from source: magic vars 30582 1726855324.81004: variable 
'omit' from source: magic vars 30582 1726855324.81006: variable 'network_service_name' from source: role '' defaults 30582 1726855324.81008: variable 'network_service_name' from source: role '' defaults 30582 1726855324.81281: variable '__network_provider_setup' from source: role '' defaults 30582 1726855324.81593: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855324.81597: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855324.81599: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855324.81637: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855324.82049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855324.86638: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855324.86831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855324.86924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855324.87026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855324.87058: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855324.87172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.87343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.87371: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.87593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.87597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.87599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.87627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.87721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.87769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.87871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.88348: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855324.88581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.88895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.88899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.88902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.88913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.89113: variable 'ansible_python' from source: facts 30582 1726855324.89137: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855324.89256: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855324.89555: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855324.89804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.89833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.89860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.89911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.90002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.90052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855324.90117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855324.90222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.90265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855324.90492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855324.90645: variable 'network_connections' from source: include params 30582 1726855324.90660: variable 'interface' from source: play vars 30582 1726855324.90822: variable 'interface' from source: play vars 30582 1726855324.91036: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855324.91456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855324.91638: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855324.91683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855324.91774: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855324.91891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855324.92035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855324.92107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855324.92180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855324.92311: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855324.92934: variable 'network_connections' from source: include params 30582 1726855324.92946: variable 'interface' from source: play vars 30582 1726855324.93060: variable 'interface' from source: play vars 30582 1726855324.93171: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855324.93350: variable '__network_wireless_connections_defined' from source: role '' defaults 
30582 1726855324.94007: variable 'network_connections' from source: include params 30582 1726855324.94092: variable 'interface' from source: play vars 30582 1726855324.94292: variable 'interface' from source: play vars 30582 1726855324.94296: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855324.94384: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855324.95100: variable 'network_connections' from source: include params 30582 1726855324.95111: variable 'interface' from source: play vars 30582 1726855324.95180: variable 'interface' from source: play vars 30582 1726855324.95364: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855324.95517: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855324.95534: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855324.95657: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855324.96693: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855324.97642: variable 'network_connections' from source: include params 30582 1726855324.97903: variable 'interface' from source: play vars 30582 1726855324.98092: variable 'interface' from source: play vars 30582 1726855324.98095: variable 'ansible_distribution' from source: facts 30582 1726855324.98098: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.98100: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.98102: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855324.98365: variable 'ansible_distribution' from source: facts 30582 1726855324.98375: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.98390: variable 'ansible_distribution_major_version' from 
source: facts 30582 1726855324.98411: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855324.98892: variable 'ansible_distribution' from source: facts 30582 1726855324.98896: variable '__network_rh_distros' from source: role '' defaults 30582 1726855324.98898: variable 'ansible_distribution_major_version' from source: facts 30582 1726855324.98900: variable 'network_provider' from source: set_fact 30582 1726855324.98902: variable 'omit' from source: magic vars 30582 1726855324.98904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855324.99292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855324.99295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855324.99297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855324.99299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855324.99301: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855324.99303: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.99305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855324.99307: Set connection var ansible_timeout to 10 30582 1726855324.99309: Set connection var ansible_connection to ssh 30582 1726855324.99311: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855324.99500: Set connection var ansible_pipelining to False 30582 1726855324.99512: Set connection var ansible_shell_executable to /bin/sh 30582 1726855324.99518: Set connection var ansible_shell_type to sh 30582 1726855324.99546: variable 'ansible_shell_executable' from 
source: unknown 30582 1726855324.99554: variable 'ansible_connection' from source: unknown 30582 1726855324.99561: variable 'ansible_module_compression' from source: unknown 30582 1726855324.99567: variable 'ansible_shell_type' from source: unknown 30582 1726855324.99573: variable 'ansible_shell_executable' from source: unknown 30582 1726855324.99579: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855324.99586: variable 'ansible_pipelining' from source: unknown 30582 1726855324.99595: variable 'ansible_timeout' from source: unknown 30582 1726855324.99602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855325.00092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855325.00102: variable 'omit' from source: magic vars 30582 1726855325.00104: starting attempt loop 30582 1726855325.00107: running the handler 30582 1726855325.00109: variable 'ansible_facts' from source: unknown 30582 1726855325.01945: _low_level_execute_command(): starting 30582 1726855325.01962: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855325.03227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855325.03305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.03535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.03559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.03651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.05378: stdout chunk (state=3): >>>/root <<< 30582 1726855325.05517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855325.05529: stdout chunk (state=3): >>><<< 30582 1726855325.05542: stderr chunk (state=3): >>><<< 30582 1726855325.05569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855325.05592: _low_level_execute_command(): starting 30582 1726855325.05603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339 `" && echo ansible-tmp-1726855325.055776-33468-53098114885339="` echo /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339 `" ) && sleep 0' 30582 1726855325.07399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855325.07420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.07711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855325.07724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.07774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
<<< 30582 1726855325.07799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.08225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.08299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.10275: stdout chunk (state=3): >>>ansible-tmp-1726855325.055776-33468-53098114885339=/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339 <<< 30582 1726855325.10519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855325.10522: stdout chunk (state=3): >>><<< 30582 1726855325.10524: stderr chunk (state=3): >>><<< 30582 1726855325.10540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855325.055776-33468-53098114885339=/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 30582 1726855325.10578: variable 'ansible_module_compression' from source: unknown 30582 1726855325.10641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855325.10761: variable 'ansible_facts' from source: unknown 30582 1726855325.11405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py 30582 1726855325.11704: Sending initial data 30582 1726855325.11713: Sent initial data (154 bytes) 30582 1726855325.12767: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855325.12805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855325.12822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855325.12848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855325.12956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.13093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.13115: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.13297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.15196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855325.15232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855325.15293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpf1m4zwr5 /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py <<< 30582 1726855325.15305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py" <<< 30582 1726855325.15360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpf1m4zwr5" to remote "/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py" <<< 30582 1726855325.18245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855325.18262: stdout chunk (state=3): >>><<< 30582 1726855325.18295: stderr chunk (state=3): >>><<< 30582 1726855325.18495: done transferring module to remote 30582 1726855325.18498: _low_level_execute_command(): starting 30582 1726855325.18500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/ /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py && sleep 0' 30582 1726855325.19965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855325.20070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.20192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.20215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.20313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.22356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855325.22370: stdout chunk (state=3): >>><<< 30582 1726855325.22386: stderr chunk (state=3): >>><<< 30582 1726855325.22408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855325.22416: _low_level_execute_command(): starting 30582 1726855325.22666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/AnsiballZ_systemd.py && sleep 0' 30582 1726855325.24101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855325.24206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855325.24410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.24455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.24520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.53853: stdout 
chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10665984", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319308288", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2144691000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": 
"0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": 
"no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": 
"none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855325.55682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855325.55696: stdout chunk (state=3): >>><<< 30582 1726855325.55709: stderr chunk (state=3): >>><<< 30582 1726855325.55738: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": 
"0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10665984", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319308288", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2144691000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", 
"StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855325.56170: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855325.56201: _low_level_execute_command(): starting 30582 1726855325.56215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855325.055776-33468-53098114885339/ > /dev/null 2>&1 && sleep 0' 30582 1726855325.57743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855325.57850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855325.57868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855325.57981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855325.59860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855325.59863: stdout chunk (state=3): >>><<< 30582 1726855325.59875: stderr chunk (state=3): >>><<< 30582 1726855325.59903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855325.59907: handler run complete 30582 1726855325.60058: attempt loop complete, returning result 30582 1726855325.60061: _execute() done 30582 1726855325.60064: dumping result to json 30582 1726855325.60088: done dumping result, returning 30582 1726855325.60142: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000001286] 30582 1726855325.60145: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001286 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855325.60880: no more pending results, returning what we have 30582 1726855325.60884: results queue empty 30582 1726855325.60885: checking for any_errors_fatal 30582 1726855325.60891: done checking for any_errors_fatal 30582 1726855325.60892: checking for max_fail_percentage 30582 1726855325.60894: done checking for max_fail_percentage 30582 1726855325.60896: checking to see if all hosts have failed and the running result is not ok 30582 1726855325.60897: done checking to see if all hosts have failed 30582 1726855325.60897: getting the remaining hosts for this loop 30582 1726855325.60899: done getting the remaining hosts for this loop 30582 1726855325.60903: getting the next task for host managed_node3 30582 1726855325.60912: done getting next task for host managed_node3 30582 1726855325.60916: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855325.60923: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855325.60942: getting variables 30582 1726855325.60945: in VariableManager get_vars() 30582 1726855325.60986: Calling all_inventory to load vars for managed_node3 30582 1726855325.61232: Calling groups_inventory to load vars for managed_node3 30582 1726855325.61235: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855325.61275: Calling all_plugins_play to load vars for managed_node3 30582 1726855325.61296: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855325.61495: Calling groups_plugins_play to load vars for managed_node3 30582 1726855325.62119: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001286 30582 1726855325.62123: WORKER PROCESS EXITING 30582 1726855325.66050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855325.70034: done with get_vars() 30582 1726855325.70141: done getting variables 30582 1726855325.70300: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:05 -0400 (0:00:00.933) 0:01:02.054 ****** 30582 1726855325.70507: entering _queue_task() for managed_node3/service 30582 1726855325.71583: worker is 1 (out of 1 available) 30582 1726855325.71601: exiting _queue_task() for managed_node3/service 30582 1726855325.71613: done queuing things up, now waiting for results queue to drain 30582 1726855325.71615: waiting for pending results... 
30582 1726855325.72026: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855325.72168: in run() - task 0affcc66-ac2b-aa83-7d57-000000001287 30582 1726855325.72180: variable 'ansible_search_path' from source: unknown 30582 1726855325.72183: variable 'ansible_search_path' from source: unknown 30582 1726855325.72220: calling self._execute() 30582 1726855325.72338: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855325.72342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855325.72352: variable 'omit' from source: magic vars 30582 1726855325.72744: variable 'ansible_distribution_major_version' from source: facts 30582 1726855325.72754: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855325.72975: variable 'network_provider' from source: set_fact 30582 1726855325.72979: Evaluated conditional (network_provider == "nm"): True 30582 1726855325.73145: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855325.73275: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855325.73514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855325.76432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855325.76545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855325.76654: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855325.76735: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855325.76738: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855325.76953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855325.76957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855325.76960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855325.77090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855325.77094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855325.77106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855325.77140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855325.77158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855325.77253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855325.77279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855325.77349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855325.77373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855325.77632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855325.77658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855325.77675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855325.78150: variable 'network_connections' from source: include params 30582 1726855325.78164: variable 'interface' from source: play vars 30582 1726855325.78475: variable 'interface' from source: play vars 30582 1726855325.78743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855325.79175: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855325.79219: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855325.79357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855325.79407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855325.79482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855325.79545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855325.79549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855325.79685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855325.79691: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855325.79889: variable 'network_connections' from source: include params 30582 1726855325.79895: variable 'interface' from source: play vars 30582 1726855325.79962: variable 'interface' from source: play vars 30582 1726855325.79994: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855325.79999: when evaluation is False, skipping this task 30582 1726855325.80009: _execute() done 30582 1726855325.80012: dumping result to json 30582 1726855325.80014: done dumping result, returning 30582 1726855325.80024: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000001287] 30582 
1726855325.80035: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001287 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855325.80260: no more pending results, returning what we have 30582 1726855325.80264: results queue empty 30582 1726855325.80265: checking for any_errors_fatal 30582 1726855325.80291: done checking for any_errors_fatal 30582 1726855325.80292: checking for max_fail_percentage 30582 1726855325.80294: done checking for max_fail_percentage 30582 1726855325.80295: checking to see if all hosts have failed and the running result is not ok 30582 1726855325.80295: done checking to see if all hosts have failed 30582 1726855325.80296: getting the remaining hosts for this loop 30582 1726855325.80297: done getting the remaining hosts for this loop 30582 1726855325.80392: getting the next task for host managed_node3 30582 1726855325.80400: done getting next task for host managed_node3 30582 1726855325.80404: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855325.80415: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855325.80431: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001287 30582 1726855325.80436: WORKER PROCESS EXITING 30582 1726855325.80453: getting variables 30582 1726855325.80455: in VariableManager get_vars() 30582 1726855325.80524: Calling all_inventory to load vars for managed_node3 30582 1726855325.80528: Calling groups_inventory to load vars for managed_node3 30582 1726855325.80530: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855325.80539: Calling all_plugins_play to load vars for managed_node3 30582 1726855325.80542: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855325.80567: Calling groups_plugins_play to load vars for managed_node3 30582 1726855325.84038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855325.86063: done with get_vars() 30582 1726855325.86101: done getting variables 30582 1726855325.86197: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:05 -0400 (0:00:00.157) 0:01:02.212 ****** 30582 1726855325.86234: entering _queue_task() for managed_node3/service 30582 1726855325.86921: worker is 1 (out of 1 available) 30582 
1726855325.86932: exiting _queue_task() for managed_node3/service 30582 1726855325.86943: done queuing things up, now waiting for results queue to drain 30582 1726855325.86944: waiting for pending results... 30582 1726855325.87341: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855325.87711: in run() - task 0affcc66-ac2b-aa83-7d57-000000001288 30582 1726855325.87774: variable 'ansible_search_path' from source: unknown 30582 1726855325.87778: variable 'ansible_search_path' from source: unknown 30582 1726855325.87780: calling self._execute() 30582 1726855325.87895: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855325.87932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855325.87992: variable 'omit' from source: magic vars 30582 1726855325.88648: variable 'ansible_distribution_major_version' from source: facts 30582 1726855325.88666: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855325.88904: variable 'network_provider' from source: set_fact 30582 1726855325.88936: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855325.88945: when evaluation is False, skipping this task 30582 1726855325.88952: _execute() done 30582 1726855325.88959: dumping result to json 30582 1726855325.88972: done dumping result, returning 30582 1726855325.89010: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000001288] 30582 1726855325.89022: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001288 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855325.89359: no more pending results, returning what we have 30582 1726855325.89363: results queue empty 30582 1726855325.89364: 
checking for any_errors_fatal 30582 1726855325.89375: done checking for any_errors_fatal 30582 1726855325.89375: checking for max_fail_percentage 30582 1726855325.89378: done checking for max_fail_percentage 30582 1726855325.89379: checking to see if all hosts have failed and the running result is not ok 30582 1726855325.89380: done checking to see if all hosts have failed 30582 1726855325.89380: getting the remaining hosts for this loop 30582 1726855325.89382: done getting the remaining hosts for this loop 30582 1726855325.89590: getting the next task for host managed_node3 30582 1726855325.89599: done getting next task for host managed_node3 30582 1726855325.89603: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855325.89610: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855325.89640: getting variables 30582 1726855325.89642: in VariableManager get_vars() 30582 1726855325.89682: Calling all_inventory to load vars for managed_node3 30582 1726855325.89685: Calling groups_inventory to load vars for managed_node3 30582 1726855325.89747: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855325.89758: Calling all_plugins_play to load vars for managed_node3 30582 1726855325.89762: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855325.89765: Calling groups_plugins_play to load vars for managed_node3 30582 1726855325.89800: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001288 30582 1726855325.89807: WORKER PROCESS EXITING 30582 1726855325.91738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855325.93432: done with get_vars() 30582 1726855325.93459: done getting variables 30582 1726855325.93529: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:05 -0400 (0:00:00.073) 0:01:02.285 ****** 30582 1726855325.93566: entering _queue_task() for managed_node3/copy 30582 1726855325.93974: worker is 1 (out of 1 available) 30582 1726855325.93990: exiting _queue_task() for managed_node3/copy 30582 1726855325.94003: done queuing things up, now waiting for results queue to drain 30582 1726855325.94005: waiting for pending results... 
30582 1726855325.94279: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855325.94418: in run() - task 0affcc66-ac2b-aa83-7d57-000000001289 30582 1726855325.94432: variable 'ansible_search_path' from source: unknown 30582 1726855325.94435: variable 'ansible_search_path' from source: unknown 30582 1726855325.94479: calling self._execute() 30582 1726855325.94565: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855325.94568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855325.94584: variable 'omit' from source: magic vars 30582 1726855325.94950: variable 'ansible_distribution_major_version' from source: facts 30582 1726855325.95013: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855325.95084: variable 'network_provider' from source: set_fact 30582 1726855325.95090: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855325.95093: when evaluation is False, skipping this task 30582 1726855325.95096: _execute() done 30582 1726855325.95118: dumping result to json 30582 1726855325.95122: done dumping result, returning 30582 1726855325.95134: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000001289] 30582 1726855325.95136: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001289 30582 1726855325.95356: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001289 30582 1726855325.95359: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855325.95401: no more pending results, returning what we have 30582 1726855325.95404: results queue empty 30582 1726855325.95405: checking for 
any_errors_fatal 30582 1726855325.95409: done checking for any_errors_fatal 30582 1726855325.95410: checking for max_fail_percentage 30582 1726855325.95412: done checking for max_fail_percentage 30582 1726855325.95413: checking to see if all hosts have failed and the running result is not ok 30582 1726855325.95413: done checking to see if all hosts have failed 30582 1726855325.95414: getting the remaining hosts for this loop 30582 1726855325.95415: done getting the remaining hosts for this loop 30582 1726855325.95418: getting the next task for host managed_node3 30582 1726855325.95425: done getting next task for host managed_node3 30582 1726855325.95428: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855325.95433: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855325.95454: getting variables 30582 1726855325.95455: in VariableManager get_vars() 30582 1726855325.95493: Calling all_inventory to load vars for managed_node3 30582 1726855325.95496: Calling groups_inventory to load vars for managed_node3 30582 1726855325.95498: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855325.95508: Calling all_plugins_play to load vars for managed_node3 30582 1726855325.95511: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855325.95514: Calling groups_plugins_play to load vars for managed_node3 30582 1726855325.96914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855325.98640: done with get_vars() 30582 1726855325.98668: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:05 -0400 (0:00:00.051) 0:01:02.337 ****** 30582 1726855325.98767: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855325.99154: worker is 1 (out of 1 available) 30582 1726855325.99283: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855325.99296: done queuing things up, now waiting for results queue to drain 30582 1726855325.99297: waiting for pending results... 
30582 1726855325.99611: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855325.99658: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128a 30582 1726855325.99682: variable 'ansible_search_path' from source: unknown 30582 1726855325.99692: variable 'ansible_search_path' from source: unknown 30582 1726855325.99740: calling self._execute() 30582 1726855325.99849: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855325.99862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855325.99879: variable 'omit' from source: magic vars 30582 1726855326.00358: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.00362: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.00364: variable 'omit' from source: magic vars 30582 1726855326.00398: variable 'omit' from source: magic vars 30582 1726855326.00574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855326.03244: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855326.03330: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855326.03372: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855326.03421: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855326.03451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855326.03592: variable 'network_provider' from source: set_fact 30582 1726855326.03703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855326.03744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855326.03776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855326.03823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855326.03992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855326.03995: variable 'omit' from source: magic vars 30582 1726855326.04054: variable 'omit' from source: magic vars 30582 1726855326.04168: variable 'network_connections' from source: include params 30582 1726855326.04192: variable 'interface' from source: play vars 30582 1726855326.04264: variable 'interface' from source: play vars 30582 1726855326.04424: variable 'omit' from source: magic vars 30582 1726855326.04446: variable '__lsr_ansible_managed' from source: task vars 30582 1726855326.04511: variable '__lsr_ansible_managed' from source: task vars 30582 1726855326.04734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855326.04989: Loaded config def from plugin (lookup/template) 30582 1726855326.04999: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855326.05029: File lookup term: get_ansible_managed.j2 30582 1726855326.05035: variable 
'ansible_search_path' from source: unknown 30582 1726855326.05045: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855326.05063: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855326.05098: variable 'ansible_search_path' from source: unknown 30582 1726855326.12108: variable 'ansible_managed' from source: unknown 30582 1726855326.12293: variable 'omit' from source: magic vars 30582 1726855326.12297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855326.12300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855326.12327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855326.12353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855326.12367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.12412: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855326.12420: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.12443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.12552: Set connection var ansible_timeout to 10 30582 1726855326.12763: Set connection var ansible_connection to ssh 30582 1726855326.12767: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855326.12771: Set connection var ansible_pipelining to False 30582 1726855326.12779: Set connection var ansible_shell_executable to /bin/sh 30582 1726855326.12781: Set connection var ansible_shell_type to sh 30582 1726855326.12783: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.12785: variable 'ansible_connection' from source: unknown 30582 1726855326.12794: variable 'ansible_module_compression' from source: unknown 30582 1726855326.12796: variable 'ansible_shell_type' from source: unknown 30582 1726855326.12798: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.12800: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.12802: variable 'ansible_pipelining' from source: unknown 30582 1726855326.12804: variable 'ansible_timeout' from source: unknown 30582 1726855326.12806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.12838: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855326.12883: variable 'omit' from 
source: magic vars 30582 1726855326.12924: starting attempt loop 30582 1726855326.12927: running the handler 30582 1726855326.12930: _low_level_execute_command(): starting 30582 1726855326.12936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855326.13810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855326.13815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855326.13830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.13856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.13955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.15812: stdout chunk (state=3): >>>/root <<< 30582 1726855326.15872: stdout chunk (state=3): >>><<< 30582 1726855326.16130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.16134: stderr chunk (state=3): >>><<< 30582 1726855326.16137: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855326.16140: _low_level_execute_command(): starting 30582 1726855326.16142: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021 `" && echo ansible-tmp-1726855326.1602402-33502-277553609082021="` echo /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021 `" ) && sleep 0' 30582 1726855326.16804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.16868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855326.16883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.16889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.16982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.19434: stdout chunk (state=3): >>>ansible-tmp-1726855326.1602402-33502-277553609082021=/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021 <<< 30582 1726855326.19439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.19441: stdout chunk (state=3): >>><<< 30582 1726855326.19444: stderr chunk (state=3): >>><<< 30582 1726855326.19446: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855326.1602402-33502-277553609082021=/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855326.19448: variable 'ansible_module_compression' from source: unknown 30582 1726855326.19662: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855326.19706: variable 'ansible_facts' from source: unknown 30582 1726855326.19993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py 30582 1726855326.20220: Sending initial data 30582 1726855326.20229: Sent initial data (168 bytes) 30582 1726855326.21144: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.21153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855326.21155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.21158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.21229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.22816: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855326.22861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855326.22937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl5iq6pzf /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py <<< 30582 1726855326.22941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py" <<< 30582 1726855326.23015: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpl5iq6pzf" to remote "/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py" <<< 30582 1726855326.24995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.25081: stderr chunk (state=3): >>><<< 30582 1726855326.25085: stdout chunk (state=3): >>><<< 30582 1726855326.25176: done transferring module to remote 30582 1726855326.25180: _low_level_execute_command(): starting 30582 1726855326.25182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/ /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py && sleep 0' 30582 1726855326.26164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855326.26495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.26512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.26778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.28694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.28697: stdout chunk (state=3): >>><<< 30582 1726855326.28700: stderr chunk (state=3): >>><<< 30582 1726855326.28702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855326.28705: _low_level_execute_command(): starting 30582 1726855326.28707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/AnsiballZ_network_connections.py && sleep 0' 30582 1726855326.30348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855326.30352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855326.30355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855326.30395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855326.30400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855326.30403: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855326.30405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.30503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855326.30506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855326.30508: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855326.30510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855326.30512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855326.30559: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855326.30562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855326.30565: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855326.30568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.30570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.30696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.55729: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855326.57794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855326.57802: stdout chunk (state=3): >>><<< 30582 1726855326.57805: stderr chunk (state=3): >>><<< 30582 1726855326.57808: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855326.57810: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855326.57813: _low_level_execute_command(): starting 30582 1726855326.57815: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855326.1602402-33502-277553609082021/ > /dev/null 2>&1 && sleep 0' 30582 1726855326.59307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.59404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.59503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.59628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.61478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.61638: stderr chunk (state=3): >>><<< 30582 1726855326.61647: stdout chunk (state=3): >>><<< 30582 1726855326.61669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855326.61680: handler run complete 30582 1726855326.61712: attempt loop complete, returning result 30582 1726855326.61743: _execute() done 30582 1726855326.61751: dumping result to json 30582 1726855326.61761: done dumping result, returning 30582 1726855326.62100: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-00000000128a] 30582 1726855326.62104: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128a 30582 1726855326.62184: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128a 30582 1726855326.62190: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active 30582 1726855326.62376: no more pending results, returning what we have 30582 1726855326.62380: results queue empty 30582 1726855326.62381: checking for any_errors_fatal 30582 1726855326.62385: done checking for any_errors_fatal 30582 1726855326.62386: checking for max_fail_percentage 30582 1726855326.62389: done checking for max_fail_percentage 30582 1726855326.62391: checking to see if all hosts have failed and the running result is not ok 30582 1726855326.62391: done checking to see if all hosts have failed 30582 1726855326.62392: getting the remaining hosts for this loop 30582 1726855326.62398: done getting the remaining hosts for this loop 30582 1726855326.62402: getting the next task for host managed_node3 30582 1726855326.62410: done getting next task for host 
managed_node3 30582 1726855326.62413: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855326.62418: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855326.62430: getting variables 30582 1726855326.62431: in VariableManager get_vars() 30582 1726855326.62852: Calling all_inventory to load vars for managed_node3 30582 1726855326.62855: Calling groups_inventory to load vars for managed_node3 30582 1726855326.62858: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855326.62867: Calling all_plugins_play to load vars for managed_node3 30582 1726855326.62872: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855326.62876: Calling groups_plugins_play to load vars for managed_node3 30582 1726855326.65412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855326.67558: done with get_vars() 30582 1726855326.67596: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:06 -0400 (0:00:00.689) 0:01:03.027 ****** 30582 1726855326.67712: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855326.68265: worker is 1 (out of 1 available) 30582 1726855326.68280: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855326.68331: done queuing things up, now waiting for results queue to drain 30582 1726855326.68333: waiting for pending results... 
30582 1726855326.68440: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855326.68537: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128b 30582 1726855326.68546: variable 'ansible_search_path' from source: unknown 30582 1726855326.68550: variable 'ansible_search_path' from source: unknown 30582 1726855326.68581: calling self._execute() 30582 1726855326.68660: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.68665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.68676: variable 'omit' from source: magic vars 30582 1726855326.68982: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.68993: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.69084: variable 'network_state' from source: role '' defaults 30582 1726855326.69094: Evaluated conditional (network_state != {}): False 30582 1726855326.69098: when evaluation is False, skipping this task 30582 1726855326.69101: _execute() done 30582 1726855326.69105: dumping result to json 30582 1726855326.69107: done dumping result, returning 30582 1726855326.69116: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-00000000128b] 30582 1726855326.69119: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128b 30582 1726855326.69202: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128b 30582 1726855326.69205: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855326.69276: no more pending results, returning what we have 30582 1726855326.69280: results queue empty 30582 1726855326.69281: checking for any_errors_fatal 30582 1726855326.69306: done checking for any_errors_fatal 
30582 1726855326.69308: checking for max_fail_percentage 30582 1726855326.69589: done checking for max_fail_percentage 30582 1726855326.69591: checking to see if all hosts have failed and the running result is not ok 30582 1726855326.69592: done checking to see if all hosts have failed 30582 1726855326.69593: getting the remaining hosts for this loop 30582 1726855326.69594: done getting the remaining hosts for this loop 30582 1726855326.69598: getting the next task for host managed_node3 30582 1726855326.69605: done getting next task for host managed_node3 30582 1726855326.69609: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855326.69614: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855326.69634: getting variables 30582 1726855326.69636: in VariableManager get_vars() 30582 1726855326.69667: Calling all_inventory to load vars for managed_node3 30582 1726855326.69670: Calling groups_inventory to load vars for managed_node3 30582 1726855326.69672: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855326.69681: Calling all_plugins_play to load vars for managed_node3 30582 1726855326.69684: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855326.69694: Calling groups_plugins_play to load vars for managed_node3 30582 1726855326.71573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855326.73012: done with get_vars() 30582 1726855326.73035: done getting variables 30582 1726855326.73088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:06 -0400 (0:00:00.054) 0:01:03.081 ****** 30582 1726855326.73125: entering _queue_task() for managed_node3/debug 30582 1726855326.73455: worker is 1 (out of 1 available) 30582 1726855326.73468: exiting _queue_task() for managed_node3/debug 30582 1726855326.73480: done queuing things up, now waiting for results queue to drain 30582 1726855326.73482: waiting for pending results... 
30582 1726855326.73817: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855326.73929: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128c 30582 1726855326.74094: variable 'ansible_search_path' from source: unknown 30582 1726855326.74097: variable 'ansible_search_path' from source: unknown 30582 1726855326.74100: calling self._execute() 30582 1726855326.74103: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.74105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.74111: variable 'omit' from source: magic vars 30582 1726855326.74493: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.74510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.74526: variable 'omit' from source: magic vars 30582 1726855326.74593: variable 'omit' from source: magic vars 30582 1726855326.74631: variable 'omit' from source: magic vars 30582 1726855326.74689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855326.74734: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855326.74771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855326.74801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.74818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.74859: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855326.74873: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.74882: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855326.75000: Set connection var ansible_timeout to 10 30582 1726855326.75092: Set connection var ansible_connection to ssh 30582 1726855326.75099: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855326.75102: Set connection var ansible_pipelining to False 30582 1726855326.75104: Set connection var ansible_shell_executable to /bin/sh 30582 1726855326.75106: Set connection var ansible_shell_type to sh 30582 1726855326.75352: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.75355: variable 'ansible_connection' from source: unknown 30582 1726855326.75358: variable 'ansible_module_compression' from source: unknown 30582 1726855326.75360: variable 'ansible_shell_type' from source: unknown 30582 1726855326.75362: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.75364: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.75366: variable 'ansible_pipelining' from source: unknown 30582 1726855326.75368: variable 'ansible_timeout' from source: unknown 30582 1726855326.75369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.75705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855326.75709: variable 'omit' from source: magic vars 30582 1726855326.75712: starting attempt loop 30582 1726855326.75714: running the handler 30582 1726855326.75946: variable '__network_connections_result' from source: set_fact 30582 1726855326.76194: handler run complete 30582 1726855326.76198: attempt loop complete, returning result 30582 1726855326.76200: _execute() done 30582 1726855326.76202: dumping result to json 30582 1726855326.76204: 
done dumping result, returning 30582 1726855326.76207: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000128c] 30582 1726855326.76209: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128c 30582 1726855326.76281: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128c 30582 1726855326.76290: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active" ] } 30582 1726855326.76369: no more pending results, returning what we have 30582 1726855326.76374: results queue empty 30582 1726855326.76375: checking for any_errors_fatal 30582 1726855326.76380: done checking for any_errors_fatal 30582 1726855326.76381: checking for max_fail_percentage 30582 1726855326.76383: done checking for max_fail_percentage 30582 1726855326.76384: checking to see if all hosts have failed and the running result is not ok 30582 1726855326.76385: done checking to see if all hosts have failed 30582 1726855326.76385: getting the remaining hosts for this loop 30582 1726855326.76389: done getting the remaining hosts for this loop 30582 1726855326.76393: getting the next task for host managed_node3 30582 1726855326.76402: done getting next task for host managed_node3 30582 1726855326.76406: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855326.76412: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855326.76426: getting variables 30582 1726855326.76427: in VariableManager get_vars() 30582 1726855326.76466: Calling all_inventory to load vars for managed_node3 30582 1726855326.76469: Calling groups_inventory to load vars for managed_node3 30582 1726855326.76471: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855326.76483: Calling all_plugins_play to load vars for managed_node3 30582 1726855326.76486: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855326.76899: Calling groups_plugins_play to load vars for managed_node3 30582 1726855326.78505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855326.80851: done with get_vars() 30582 1726855326.80883: done getting variables 30582 1726855326.81031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:06 -0400 (0:00:00.079) 0:01:03.160 ****** 30582 1726855326.81077: entering _queue_task() for managed_node3/debug 30582 1726855326.81456: worker is 1 (out of 1 available) 30582 1726855326.81470: exiting _queue_task() for managed_node3/debug 30582 1726855326.81483: done queuing things up, now waiting for results queue to drain 30582 1726855326.81484: waiting for pending results... 30582 1726855326.81793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855326.82093: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128d 30582 1726855326.82098: variable 'ansible_search_path' from source: unknown 30582 1726855326.82100: variable 'ansible_search_path' from source: unknown 30582 1726855326.82103: calling self._execute() 30582 1726855326.82106: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.82152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.82166: variable 'omit' from source: magic vars 30582 1726855326.82557: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.82579: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.82595: variable 'omit' from source: magic vars 30582 1726855326.82663: variable 'omit' from source: magic vars 30582 1726855326.82709: variable 'omit' from source: magic vars 30582 1726855326.82755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855326.82798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855326.82822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855326.82842: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.82857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.82898: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855326.82910: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.82918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.83122: Set connection var ansible_timeout to 10 30582 1726855326.83125: Set connection var ansible_connection to ssh 30582 1726855326.83127: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855326.83129: Set connection var ansible_pipelining to False 30582 1726855326.83131: Set connection var ansible_shell_executable to /bin/sh 30582 1726855326.83132: Set connection var ansible_shell_type to sh 30582 1726855326.83134: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.83136: variable 'ansible_connection' from source: unknown 30582 1726855326.83138: variable 'ansible_module_compression' from source: unknown 30582 1726855326.83140: variable 'ansible_shell_type' from source: unknown 30582 1726855326.83142: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.83143: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.83145: variable 'ansible_pipelining' from source: unknown 30582 1726855326.83147: variable 'ansible_timeout' from source: unknown 30582 1726855326.83149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.83264: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855326.83279: variable 'omit' from source: magic vars 30582 1726855326.83290: starting attempt loop 30582 1726855326.83297: running the handler 30582 1726855326.83354: variable '__network_connections_result' from source: set_fact 30582 1726855326.83438: variable '__network_connections_result' from source: set_fact 30582 1726855326.83557: handler run complete 30582 1726855326.83589: attempt loop complete, returning result 30582 1726855326.83597: _execute() done 30582 1726855326.83603: dumping result to json 30582 1726855326.83613: done dumping result, returning 30582 1726855326.83627: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000128d] 30582 1726855326.83664: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128d ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2e08db44-6b45-462b-a24b-1e1d0b41e5c0 skipped because already active" ] } } 30582 1726855326.83971: no more pending results, returning what we have 30582 1726855326.83975: results queue empty 30582 1726855326.83976: checking for any_errors_fatal 30582 1726855326.83984: done checking for any_errors_fatal 30582 1726855326.83985: checking for 
max_fail_percentage 30582 1726855326.83990: done checking for max_fail_percentage 30582 1726855326.83993: checking to see if all hosts have failed and the running result is not ok 30582 1726855326.83994: done checking to see if all hosts have failed 30582 1726855326.83995: getting the remaining hosts for this loop 30582 1726855326.83997: done getting the remaining hosts for this loop 30582 1726855326.84001: getting the next task for host managed_node3 30582 1726855326.84014: done getting next task for host managed_node3 30582 1726855326.84019: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855326.84024: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855326.84102: getting variables 30582 1726855326.84104: in VariableManager get_vars() 30582 1726855326.84143: Calling all_inventory to load vars for managed_node3 30582 1726855326.84146: Calling groups_inventory to load vars for managed_node3 30582 1726855326.84148: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855326.84160: Calling all_plugins_play to load vars for managed_node3 30582 1726855326.84169: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855326.84173: Calling groups_plugins_play to load vars for managed_node3 30582 1726855326.84802: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128d 30582 1726855326.84805: WORKER PROCESS EXITING 30582 1726855326.85880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855326.87425: done with get_vars() 30582 1726855326.87453: done getting variables 30582 1726855326.87519: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:06 -0400 (0:00:00.064) 0:01:03.225 ****** 30582 1726855326.87555: entering _queue_task() for managed_node3/debug 30582 1726855326.87948: worker is 1 (out of 1 available) 30582 1726855326.87961: exiting _queue_task() for managed_node3/debug 30582 1726855326.87975: done queuing things up, now waiting for results queue to drain 30582 1726855326.87977: waiting for pending results... 
30582 1726855326.88280: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855326.88428: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128e 30582 1726855326.88450: variable 'ansible_search_path' from source: unknown 30582 1726855326.88458: variable 'ansible_search_path' from source: unknown 30582 1726855326.88499: calling self._execute() 30582 1726855326.88600: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.88611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.88629: variable 'omit' from source: magic vars 30582 1726855326.89014: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.89031: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.89176: variable 'network_state' from source: role '' defaults 30582 1726855326.89197: Evaluated conditional (network_state != {}): False 30582 1726855326.89206: when evaluation is False, skipping this task 30582 1726855326.89216: _execute() done 30582 1726855326.89223: dumping result to json 30582 1726855326.89229: done dumping result, returning 30582 1726855326.89241: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-00000000128e] 30582 1726855326.89249: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128e 30582 1726855326.89535: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128e 30582 1726855326.89539: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855326.89583: no more pending results, returning what we have 30582 1726855326.89589: results queue empty 30582 1726855326.89590: checking for any_errors_fatal 30582 1726855326.89601: done checking for any_errors_fatal 30582 1726855326.89602: checking for 
max_fail_percentage 30582 1726855326.89604: done checking for max_fail_percentage 30582 1726855326.89605: checking to see if all hosts have failed and the running result is not ok 30582 1726855326.89605: done checking to see if all hosts have failed 30582 1726855326.89606: getting the remaining hosts for this loop 30582 1726855326.89607: done getting the remaining hosts for this loop 30582 1726855326.89611: getting the next task for host managed_node3 30582 1726855326.89619: done getting next task for host managed_node3 30582 1726855326.89623: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855326.89629: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855326.89653: getting variables 30582 1726855326.89655: in VariableManager get_vars() 30582 1726855326.89698: Calling all_inventory to load vars for managed_node3 30582 1726855326.89705: Calling groups_inventory to load vars for managed_node3 30582 1726855326.89708: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855326.89721: Calling all_plugins_play to load vars for managed_node3 30582 1726855326.89724: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855326.89727: Calling groups_plugins_play to load vars for managed_node3 30582 1726855326.91256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855326.92823: done with get_vars() 30582 1726855326.92849: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:06 -0400 (0:00:00.053) 0:01:03.279 ****** 30582 1726855326.92951: entering _queue_task() for managed_node3/ping 30582 1726855326.93510: worker is 1 (out of 1 available) 30582 1726855326.93520: exiting _queue_task() for managed_node3/ping 30582 1726855326.93530: done queuing things up, now waiting for results queue to drain 30582 1726855326.93532: waiting for pending results... 
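The task that just finished above was skipped because its `when` condition evaluated to False: `network_state` comes from the role defaults as an empty dict, so `network_state != {}` fails and the executor short-circuits. A minimal sketch of that decision (our own helper, not Ansible's actual implementation):

```python
# Hedged sketch: how the "skipping" result above is reached. `should_run`
# is a hypothetical helper mimicking TaskExecutor's when-evaluation.
def should_run(condition_result: bool):
    """Skip the task when the evaluated conditional is False, recording
    the false condition in the result (as the log's skip message does)."""
    if not condition_result:
        return False, {"skipped": True, "false_condition": "network_state != {}"}
    return True, {}

network_state = {}  # role default, per "variable 'network_state' from source: role '' defaults"
run, result = should_run(network_state != {})
assert run is False  # matches "Evaluated conditional (network_state != {}): False"
```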
30582 1726855326.93651: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855326.93808: in run() - task 0affcc66-ac2b-aa83-7d57-00000000128f 30582 1726855326.93832: variable 'ansible_search_path' from source: unknown 30582 1726855326.93845: variable 'ansible_search_path' from source: unknown 30582 1726855326.93894: calling self._execute() 30582 1726855326.93999: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.94020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.94131: variable 'omit' from source: magic vars 30582 1726855326.94426: variable 'ansible_distribution_major_version' from source: facts 30582 1726855326.94443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855326.94459: variable 'omit' from source: magic vars 30582 1726855326.94525: variable 'omit' from source: magic vars 30582 1726855326.94565: variable 'omit' from source: magic vars 30582 1726855326.94609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855326.94647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855326.94680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855326.94705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.94722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855326.94759: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855326.94771: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.94784: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855326.94896: Set connection var ansible_timeout to 10 30582 1726855326.94904: Set connection var ansible_connection to ssh 30582 1726855326.94916: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855326.94924: Set connection var ansible_pipelining to False 30582 1726855326.94933: Set connection var ansible_shell_executable to /bin/sh 30582 1726855326.94939: Set connection var ansible_shell_type to sh 30582 1726855326.94965: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.94993: variable 'ansible_connection' from source: unknown 30582 1726855326.94997: variable 'ansible_module_compression' from source: unknown 30582 1726855326.94999: variable 'ansible_shell_type' from source: unknown 30582 1726855326.95001: variable 'ansible_shell_executable' from source: unknown 30582 1726855326.95003: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855326.95102: variable 'ansible_pipelining' from source: unknown 30582 1726855326.95105: variable 'ansible_timeout' from source: unknown 30582 1726855326.95108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855326.95246: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855326.95265: variable 'omit' from source: magic vars 30582 1726855326.95280: starting attempt loop 30582 1726855326.95289: running the handler 30582 1726855326.95310: _low_level_execute_command(): starting 30582 1726855326.95329: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855326.96102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.96182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.96222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.96324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855326.98055: stdout chunk (state=3): >>>/root <<< 30582 1726855326.98393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855326.98397: stdout chunk (state=3): >>><<< 30582 1726855326.98400: stderr chunk (state=3): >>><<< 30582 1726855326.98402: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855326.98405: _low_level_execute_command(): starting 30582 1726855326.98408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423 `" && echo ansible-tmp-1726855326.9831352-33546-128278695960423="` echo /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423 `" ) && sleep 0' 30582 1726855326.99005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855326.99022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.99111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855326.99152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855326.99171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855326.99199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855326.99341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855327.01296: stdout chunk (state=3): >>>ansible-tmp-1726855326.9831352-33546-128278695960423=/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423 <<< 30582 1726855327.01593: stdout chunk (state=3): >>><<< 30582 1726855327.01597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855327.01599: stderr chunk (state=3): >>><<< 30582 1726855327.01602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855326.9831352-33546-128278695960423=/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855327.01604: variable 'ansible_module_compression' from source: unknown 30582 1726855327.01648: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855327.01698: variable 'ansible_facts' from source: unknown 30582 1726855327.01803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py 30582 1726855327.02015: Sending initial data 30582 1726855327.02018: Sent initial data (153 bytes) 30582 1726855327.02630: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855327.02644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855327.02704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855327.02763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855327.02786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855327.02814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855327.02900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855327.04461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855327.04539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855327.04596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjj6zau_u /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py <<< 30582 1726855327.04601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py" <<< 30582 1726855327.04673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjj6zau_u" to remote "/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py" <<< 30582 1726855327.05400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855327.05439: stderr chunk (state=3): >>><<< 30582 1726855327.05442: stdout chunk (state=3): >>><<< 30582 1726855327.05464: done transferring module to remote 30582 1726855327.05477: _low_level_execute_command(): starting 30582 1726855327.05480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/ /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py && sleep 0' 30582 1726855327.06022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855327.06026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855327.06029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855327.06041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855327.06111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855327.06115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855327.06197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855327.08002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855327.08005: stderr chunk (state=3): >>><<< 30582 1726855327.08008: stdout chunk (state=3): >>><<< 30582 1726855327.08017: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855327.08020: _low_level_execute_command(): starting 30582 1726855327.08022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/AnsiballZ_ping.py && sleep 0' 30582 1726855327.08467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855327.08472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855327.08484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855327.08545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855327.08551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
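The remote temp path threaded through the mkdir, sftp put, chmod, and python invocations above follows an `ansible-tmp-<epoch>-<pid>-<random>` naming scheme. A sketch of that scheme, with the format inferred from this trace rather than taken from Ansible source:

```python
import os
import random
import time

def make_remote_tmp_name() -> str:
    """Build a name in the ansible-tmp-<epoch>-<pid>-<random> style seen
    in this trace (e.g. ansible-tmp-1726855326.9831352-33546-128278695960423).
    Assumption: field order and separators are inferred from the log."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2 ** 48))

name = make_remote_tmp_name()
```

The directory itself is created under `umask 77`, so it is readable only by the connecting user for the lifetime of the task.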
30582 1726855327.08617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855327.23693: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855327.25099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855327.25103: stdout chunk (state=3): >>><<< 30582 1726855327.25106: stderr chunk (state=3): >>><<< 30582 1726855327.25154: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
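The AnsiballZ payload executed above is `ansible.builtin.ping`, whose stdout is the JSON result `{"ping": "pong", ...}`. A sketch of the module's contract as it appears in this log (our own function, not the module's real code):

```python
def ping(data: str = "pong") -> dict:
    """Echo `data` back under the "ping" key without reporting a change,
    matching the result seen in the log. The special-case for
    data == "crash" mirrors the documented test-failure knob."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}

result = ping()
# The task result in the log: ok: [managed_node3] => {"changed": false, "ping": "pong"}
```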
30582 1726855327.25295: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855327.25299: _low_level_execute_command(): starting 30582 1726855327.25301: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855326.9831352-33546-128278695960423/ > /dev/null 2>&1 && sleep 0' 30582 1726855327.26309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855327.26361: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855327.26381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855327.26412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855327.26570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855327.28497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855327.28501: stdout chunk (state=3): >>><<< 30582 1726855327.28503: stderr chunk (state=3): >>><<< 30582 1726855327.28505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855327.28511: handler run complete 30582 1726855327.28513: attempt loop complete, returning result 30582 1726855327.28514: _execute() done 30582 
1726855327.28517: dumping result to json 30582 1726855327.28524: done dumping result, returning 30582 1726855327.28526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-00000000128f] 30582 1726855327.28528: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128f 30582 1726855327.28954: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000128f 30582 1726855327.28957: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855327.29109: no more pending results, returning what we have 30582 1726855327.29118: results queue empty 30582 1726855327.29119: checking for any_errors_fatal 30582 1726855327.29135: done checking for any_errors_fatal 30582 1726855327.29139: checking for max_fail_percentage 30582 1726855327.29142: done checking for max_fail_percentage 30582 1726855327.29146: checking to see if all hosts have failed and the running result is not ok 30582 1726855327.29147: done checking to see if all hosts have failed 30582 1726855327.29148: getting the remaining hosts for this loop 30582 1726855327.29153: done getting the remaining hosts for this loop 30582 1726855327.29161: getting the next task for host managed_node3 30582 1726855327.29184: done getting next task for host managed_node3 30582 1726855327.29229: ^ task is: TASK: meta (role_complete) 30582 1726855327.29236: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855327.29252: getting variables 30582 1726855327.29254: in VariableManager get_vars() 30582 1726855327.29312: Calling all_inventory to load vars for managed_node3 30582 1726855327.29316: Calling groups_inventory to load vars for managed_node3 30582 1726855327.29342: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.29365: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.29372: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.29380: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.31781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.32750: done with get_vars() 30582 1726855327.32769: done getting variables 30582 1726855327.32838: done queuing things up, now waiting for results queue to drain 30582 1726855327.32840: results queue empty 30582 1726855327.32841: checking for any_errors_fatal 30582 1726855327.32844: done checking for any_errors_fatal 30582 1726855327.32845: checking for max_fail_percentage 30582 1726855327.32846: done checking for max_fail_percentage 30582 1726855327.32847: checking to see if all hosts have failed and the running result is not ok 30582 1726855327.32847: done checking to see if all hosts have failed 30582 1726855327.32848: getting the remaining hosts for this 
loop 30582 1726855327.32849: done getting the remaining hosts for this loop 30582 1726855327.32852: getting the next task for host managed_node3 30582 1726855327.32858: done getting next task for host managed_node3 30582 1726855327.32860: ^ task is: TASK: Test 30582 1726855327.32864: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855327.32867: getting variables 30582 1726855327.32867: in VariableManager get_vars() 30582 1726855327.32877: Calling all_inventory to load vars for managed_node3 30582 1726855327.32879: Calling groups_inventory to load vars for managed_node3 30582 1726855327.32881: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.32884: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.32886: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.32889: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.33868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.36676: done with get_vars() 30582 1726855327.36700: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:02:07 -0400 (0:00:00.438) 0:01:03.717 ****** 30582 1726855327.36789: entering _queue_task() 
for managed_node3/include_tasks 30582 1726855327.37713: worker is 1 (out of 1 available) 30582 1726855327.37727: exiting _queue_task() for managed_node3/include_tasks 30582 1726855327.37741: done queuing things up, now waiting for results queue to drain 30582 1726855327.37743: waiting for pending results... 30582 1726855327.38553: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855327.38559: in run() - task 0affcc66-ac2b-aa83-7d57-000000001009 30582 1726855327.38674: variable 'ansible_search_path' from source: unknown 30582 1726855327.38679: variable 'ansible_search_path' from source: unknown 30582 1726855327.38794: variable 'lsr_test' from source: include params 30582 1726855327.39229: variable 'lsr_test' from source: include params 30582 1726855327.39423: variable 'omit' from source: magic vars 30582 1726855327.39684: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855327.39694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855327.39821: variable 'omit' from source: magic vars 30582 1726855327.40401: variable 'ansible_distribution_major_version' from source: facts 30582 1726855327.40442: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855327.40446: variable 'item' from source: unknown 30582 1726855327.40506: variable 'item' from source: unknown 30582 1726855327.40540: variable 'item' from source: unknown 30582 1726855327.40781: variable 'item' from source: unknown 30582 1726855327.41024: dumping result to json 30582 1726855327.41027: done dumping result, returning 30582 1726855327.41029: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-000000001009] 30582 1726855327.41031: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001009 30582 1726855327.41071: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001009 30582 1726855327.41074: WORKER PROCESS EXITING 30582 1726855327.41153: no more 
pending results, returning what we have 30582 1726855327.41159: in VariableManager get_vars() 30582 1726855327.41335: Calling all_inventory to load vars for managed_node3 30582 1726855327.41340: Calling groups_inventory to load vars for managed_node3 30582 1726855327.41344: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.41358: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.41362: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.41365: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.43035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.44018: done with get_vars() 30582 1726855327.44043: variable 'ansible_search_path' from source: unknown 30582 1726855327.44044: variable 'ansible_search_path' from source: unknown 30582 1726855327.44089: we have included files to process 30582 1726855327.44091: generating all_blocks data 30582 1726855327.44093: done generating all_blocks data 30582 1726855327.44101: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855327.44102: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855327.44105: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855327.44310: done processing included file 30582 1726855327.44312: iterating over new_blocks loaded from include file 30582 1726855327.44314: in VariableManager get_vars() 30582 1726855327.44331: done with get_vars() 30582 1726855327.44332: filtering new block on tags 30582 1726855327.44359: done filtering new block on tags 30582 1726855327.44361: done iterating over new_blocks loaded from include file included: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node3 => (item=tasks/remove_profile.yml) 30582 1726855327.44366: extending task lists for all hosts with included blocks 30582 1726855327.45126: done extending task lists 30582 1726855327.45127: done processing included files 30582 1726855327.45127: results queue empty 30582 1726855327.45128: checking for any_errors_fatal 30582 1726855327.45129: done checking for any_errors_fatal 30582 1726855327.45130: checking for max_fail_percentage 30582 1726855327.45130: done checking for max_fail_percentage 30582 1726855327.45131: checking to see if all hosts have failed and the running result is not ok 30582 1726855327.45131: done checking to see if all hosts have failed 30582 1726855327.45132: getting the remaining hosts for this loop 30582 1726855327.45133: done getting the remaining hosts for this loop 30582 1726855327.45134: getting the next task for host managed_node3 30582 1726855327.45138: done getting next task for host managed_node3 30582 1726855327.45139: ^ task is: TASK: Include network role 30582 1726855327.45141: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30582 1726855327.45143: getting variables 30582 1726855327.45143: in VariableManager get_vars() 30582 1726855327.45151: Calling all_inventory to load vars for managed_node3 30582 1726855327.45153: Calling groups_inventory to load vars for managed_node3 30582 1726855327.45154: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.45158: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.45160: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.45162: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.52577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.56058: done with get_vars() 30582 1726855327.56152: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 14:02:07 -0400 (0:00:00.194) 0:01:03.912 ****** 30582 1726855327.56235: entering _queue_task() for managed_node3/include_role 30582 1726855327.57301: worker is 1 (out of 1 available) 30582 1726855327.57316: exiting _queue_task() for managed_node3/include_role 30582 1726855327.57329: done queuing things up, now waiting for results queue to drain 30582 1726855327.57331: waiting for pending results... 
30582 1726855327.57912: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855327.58294: in run() - task 0affcc66-ac2b-aa83-7d57-0000000013e8 30582 1726855327.58300: variable 'ansible_search_path' from source: unknown 30582 1726855327.58303: variable 'ansible_search_path' from source: unknown 30582 1726855327.58307: calling self._execute() 30582 1726855327.58619: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855327.58628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855327.58637: variable 'omit' from source: magic vars 30582 1726855327.59399: variable 'ansible_distribution_major_version' from source: facts 30582 1726855327.59403: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855327.59406: _execute() done 30582 1726855327.59409: dumping result to json 30582 1726855327.59412: done dumping result, returning 30582 1726855327.59415: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-0000000013e8] 30582 1726855327.59417: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000013e8 30582 1726855327.59509: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000013e8 30582 1726855327.59513: WORKER PROCESS EXITING 30582 1726855327.59546: no more pending results, returning what we have 30582 1726855327.59552: in VariableManager get_vars() 30582 1726855327.59601: Calling all_inventory to load vars for managed_node3 30582 1726855327.59605: Calling groups_inventory to load vars for managed_node3 30582 1726855327.59609: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.59624: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.59628: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.59631: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.64377: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.67219: done with get_vars() 30582 1726855327.67325: variable 'ansible_search_path' from source: unknown 30582 1726855327.67326: variable 'ansible_search_path' from source: unknown 30582 1726855327.67680: variable 'omit' from source: magic vars 30582 1726855327.67845: variable 'omit' from source: magic vars 30582 1726855327.67862: variable 'omit' from source: magic vars 30582 1726855327.67866: we have included files to process 30582 1726855327.67867: generating all_blocks data 30582 1726855327.67871: done generating all_blocks data 30582 1726855327.67873: processing included file: fedora.linux_system_roles.network 30582 1726855327.67899: in VariableManager get_vars() 30582 1726855327.67916: done with get_vars() 30582 1726855327.67946: in VariableManager get_vars() 30582 1726855327.68085: done with get_vars() 30582 1726855327.68133: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855327.68367: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855327.68485: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855327.69014: in VariableManager get_vars() 30582 1726855327.69034: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855327.72533: iterating over new_blocks loaded from include file 30582 1726855327.72536: in VariableManager get_vars() 30582 1726855327.72563: done with get_vars() 30582 1726855327.72566: filtering new block on tags 30582 1726855327.72991: done filtering new block on tags 30582 1726855327.72995: in VariableManager get_vars() 30582 1726855327.73013: done with get_vars() 30582 1726855327.73015: filtering new block on tags 30582 1726855327.73039: done 
filtering new block on tags 30582 1726855327.73041: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855327.73047: extending task lists for all hosts with included blocks 30582 1726855327.73167: done extending task lists 30582 1726855327.73169: done processing included files 30582 1726855327.73170: results queue empty 30582 1726855327.73170: checking for any_errors_fatal 30582 1726855327.73174: done checking for any_errors_fatal 30582 1726855327.73175: checking for max_fail_percentage 30582 1726855327.73176: done checking for max_fail_percentage 30582 1726855327.73177: checking to see if all hosts have failed and the running result is not ok 30582 1726855327.73178: done checking to see if all hosts have failed 30582 1726855327.73179: getting the remaining hosts for this loop 30582 1726855327.73180: done getting the remaining hosts for this loop 30582 1726855327.73183: getting the next task for host managed_node3 30582 1726855327.73189: done getting next task for host managed_node3 30582 1726855327.73192: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855327.73195: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855327.73206: getting variables 30582 1726855327.73207: in VariableManager get_vars() 30582 1726855327.73221: Calling all_inventory to load vars for managed_node3 30582 1726855327.73224: Calling groups_inventory to load vars for managed_node3 30582 1726855327.73226: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.73232: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.73234: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.73237: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.74584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.77012: done with get_vars() 30582 1726855327.77045: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:07 -0400 (0:00:00.208) 0:01:04.121 ****** 30582 1726855327.77134: entering _queue_task() for managed_node3/include_tasks 30582 1726855327.77552: worker is 1 (out of 1 available) 30582 1726855327.77571: exiting _queue_task() for managed_node3/include_tasks 30582 1726855327.77584: done queuing things up, now waiting for results queue to drain 30582 1726855327.77586: waiting for pending results... 
30582 1726855327.78094: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855327.78133: in run() - task 0affcc66-ac2b-aa83-7d57-00000000145f 30582 1726855327.78242: variable 'ansible_search_path' from source: unknown 30582 1726855327.78247: variable 'ansible_search_path' from source: unknown 30582 1726855327.78251: calling self._execute() 30582 1726855327.78261: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855327.78265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855327.78269: variable 'omit' from source: magic vars 30582 1726855327.78775: variable 'ansible_distribution_major_version' from source: facts 30582 1726855327.78779: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855327.78781: _execute() done 30582 1726855327.78784: dumping result to json 30582 1726855327.78786: done dumping result, returning 30582 1726855327.78790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-00000000145f] 30582 1726855327.78792: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000145f 30582 1726855327.78874: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000145f 30582 1726855327.78932: no more pending results, returning what we have 30582 1726855327.78939: in VariableManager get_vars() 30582 1726855327.78990: Calling all_inventory to load vars for managed_node3 30582 1726855327.78994: Calling groups_inventory to load vars for managed_node3 30582 1726855327.78997: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.79014: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.79018: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.79022: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855327.79548: WORKER PROCESS EXITING 30582 1726855327.81245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.83462: done with get_vars() 30582 1726855327.83502: variable 'ansible_search_path' from source: unknown 30582 1726855327.83503: variable 'ansible_search_path' from source: unknown 30582 1726855327.83557: we have included files to process 30582 1726855327.83559: generating all_blocks data 30582 1726855327.83561: done generating all_blocks data 30582 1726855327.83564: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855327.83565: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855327.83568: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855327.84734: done processing included file 30582 1726855327.84985: iterating over new_blocks loaded from include file 30582 1726855327.84988: in VariableManager get_vars() 30582 1726855327.85022: done with get_vars() 30582 1726855327.85024: filtering new block on tags 30582 1726855327.85072: done filtering new block on tags 30582 1726855327.85076: in VariableManager get_vars() 30582 1726855327.85109: done with get_vars() 30582 1726855327.85111: filtering new block on tags 30582 1726855327.85167: done filtering new block on tags 30582 1726855327.85172: in VariableManager get_vars() 30582 1726855327.85367: done with get_vars() 30582 1726855327.85372: filtering new block on tags 30582 1726855327.85656: done filtering new block on tags 30582 1726855327.85659: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855327.85665: extending task lists for all hosts 
with included blocks 30582 1726855327.89668: done extending task lists 30582 1726855327.89672: done processing included files 30582 1726855327.89673: results queue empty 30582 1726855327.89678: checking for any_errors_fatal 30582 1726855327.89681: done checking for any_errors_fatal 30582 1726855327.89682: checking for max_fail_percentage 30582 1726855327.89683: done checking for max_fail_percentage 30582 1726855327.89684: checking to see if all hosts have failed and the running result is not ok 30582 1726855327.89685: done checking to see if all hosts have failed 30582 1726855327.89686: getting the remaining hosts for this loop 30582 1726855327.89688: done getting the remaining hosts for this loop 30582 1726855327.89691: getting the next task for host managed_node3 30582 1726855327.89706: done getting next task for host managed_node3 30582 1726855327.89709: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855327.89713: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855327.89724: getting variables 30582 1726855327.89726: in VariableManager get_vars() 30582 1726855327.89749: Calling all_inventory to load vars for managed_node3 30582 1726855327.89751: Calling groups_inventory to load vars for managed_node3 30582 1726855327.89754: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855327.89760: Calling all_plugins_play to load vars for managed_node3 30582 1726855327.89763: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855327.89766: Calling groups_plugins_play to load vars for managed_node3 30582 1726855327.92364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855327.97051: done with get_vars() 30582 1726855327.97097: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:07 -0400 (0:00:00.201) 0:01:04.322 ****** 30582 1726855327.97304: entering _queue_task() for managed_node3/setup 30582 1726855327.98398: worker is 1 (out of 1 available) 30582 1726855327.98409: exiting _queue_task() for managed_node3/setup 30582 1726855327.98421: done queuing things up, now waiting for results queue to drain 30582 1726855327.98423: waiting for pending results... 
30582 1726855327.98866: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855327.99147: in run() - task 0affcc66-ac2b-aa83-7d57-0000000014b6 30582 1726855327.99151: variable 'ansible_search_path' from source: unknown 30582 1726855327.99154: variable 'ansible_search_path' from source: unknown 30582 1726855327.99157: calling self._execute() 30582 1726855327.99231: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855327.99243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855327.99267: variable 'omit' from source: magic vars 30582 1726855327.99692: variable 'ansible_distribution_major_version' from source: facts 30582 1726855327.99798: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855328.00059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855328.03781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855328.03872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855328.03923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855328.03978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855328.04066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855328.04105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855328.04141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855328.04181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855328.04233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855328.04252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855328.04328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855328.04358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855328.04399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855328.04492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855328.04500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855328.04664: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855328.04684: variable 'ansible_facts' from source: unknown 30582 1726855328.05547: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855328.05556: when evaluation is False, skipping this task 30582 1726855328.05564: _execute() done 30582 1726855328.05574: dumping result to json 30582 1726855328.05691: done dumping result, returning 30582 1726855328.05698: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-0000000014b6] 30582 1726855328.05701: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b6 30582 1726855328.05776: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b6 30582 1726855328.05779: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855328.05827: no more pending results, returning what we have 30582 1726855328.05831: results queue empty 30582 1726855328.05833: checking for any_errors_fatal 30582 1726855328.05834: done checking for any_errors_fatal 30582 1726855328.05835: checking for max_fail_percentage 30582 1726855328.05837: done checking for max_fail_percentage 30582 1726855328.05838: checking to see if all hosts have failed and the running result is not ok 30582 1726855328.05839: done checking to see if all hosts have failed 30582 1726855328.05839: getting the remaining hosts for this loop 30582 1726855328.05841: done getting the remaining hosts for this loop 30582 1726855328.05845: getting the next task for host managed_node3 30582 1726855328.05859: done getting next task for host managed_node3 30582 1726855328.05863: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855328.05872: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855328.05896: getting variables 30582 1726855328.05898: in VariableManager get_vars() 30582 1726855328.05940: Calling all_inventory to load vars for managed_node3 30582 1726855328.05943: Calling groups_inventory to load vars for managed_node3 30582 1726855328.05946: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855328.05957: Calling all_plugins_play to load vars for managed_node3 30582 1726855328.05961: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855328.05973: Calling groups_plugins_play to load vars for managed_node3 30582 1726855328.08747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855328.12358: done with get_vars() 30582 1726855328.12394: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:08 -0400 (0:00:00.152) 0:01:04.475 ****** 30582 1726855328.12515: entering _queue_task() for managed_node3/stat 30582 1726855328.13075: worker is 1 (out of 1 available) 30582 1726855328.13092: exiting _queue_task() for managed_node3/stat 30582 1726855328.13103: done queuing things up, now waiting for results queue to drain 30582 1726855328.13104: waiting for pending results... 
30582 1726855328.14310: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855328.14316: in run() - task 0affcc66-ac2b-aa83-7d57-0000000014b8 30582 1726855328.14319: variable 'ansible_search_path' from source: unknown 30582 1726855328.14322: variable 'ansible_search_path' from source: unknown 30582 1726855328.14325: calling self._execute() 30582 1726855328.14471: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855328.14484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855328.14517: variable 'omit' from source: magic vars 30582 1726855328.15465: variable 'ansible_distribution_major_version' from source: facts 30582 1726855328.15523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855328.15913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855328.16412: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855328.16471: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855328.16518: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855328.16567: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855328.16667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855328.16703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855328.16734: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855328.16773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855328.16877: variable '__network_is_ostree' from source: set_fact 30582 1726855328.16893: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855328.16902: when evaluation is False, skipping this task 30582 1726855328.16911: _execute() done 30582 1726855328.16918: dumping result to json 30582 1726855328.16926: done dumping result, returning 30582 1726855328.16938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-0000000014b8] 30582 1726855328.16949: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b8 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855328.17117: no more pending results, returning what we have 30582 1726855328.17120: results queue empty 30582 1726855328.17123: checking for any_errors_fatal 30582 1726855328.17133: done checking for any_errors_fatal 30582 1726855328.17133: checking for max_fail_percentage 30582 1726855328.17135: done checking for max_fail_percentage 30582 1726855328.17137: checking to see if all hosts have failed and the running result is not ok 30582 1726855328.17137: done checking to see if all hosts have failed 30582 1726855328.17138: getting the remaining hosts for this loop 30582 1726855328.17140: done getting the remaining hosts for this loop 30582 1726855328.17144: getting the next task for host managed_node3 30582 1726855328.17157: done getting next task for host managed_node3 30582 
1726855328.17161: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855328.17166: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855328.17195: getting variables 30582 1726855328.17197: in VariableManager get_vars() 30582 1726855328.17555: Calling all_inventory to load vars for managed_node3 30582 1726855328.17558: Calling groups_inventory to load vars for managed_node3 30582 1726855328.17561: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855328.17568: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b8 30582 1726855328.17574: WORKER PROCESS EXITING 30582 1726855328.17584: Calling all_plugins_play to load vars for managed_node3 30582 1726855328.17589: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855328.17594: Calling groups_plugins_play to load vars for managed_node3 30582 1726855328.20011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855328.21837: done with get_vars() 30582 1726855328.21862: done getting variables 30582 1726855328.21934: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:08 -0400 (0:00:00.094) 0:01:04.569 ****** 30582 1726855328.21977: entering _queue_task() for managed_node3/set_fact 30582 1726855328.22424: worker is 1 (out of 1 available) 30582 1726855328.22436: exiting _queue_task() for managed_node3/set_fact 30582 1726855328.22496: done queuing things up, now waiting for results queue to drain 30582 1726855328.22498: waiting for pending results... 
30582 1726855328.22801: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855328.22933: in run() - task 0affcc66-ac2b-aa83-7d57-0000000014b9 30582 1726855328.22953: variable 'ansible_search_path' from source: unknown 30582 1726855328.22961: variable 'ansible_search_path' from source: unknown 30582 1726855328.23011: calling self._execute() 30582 1726855328.23109: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855328.23194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855328.23198: variable 'omit' from source: magic vars 30582 1726855328.23542: variable 'ansible_distribution_major_version' from source: facts 30582 1726855328.23559: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855328.23741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855328.24047: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855328.24200: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855328.24203: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855328.24207: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855328.24277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855328.24317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855328.24347: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855328.24381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855328.24486: variable '__network_is_ostree' from source: set_fact 30582 1726855328.24501: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855328.24509: when evaluation is False, skipping this task 30582 1726855328.24522: _execute() done 30582 1726855328.24533: dumping result to json 30582 1726855328.24541: done dumping result, returning 30582 1726855328.24555: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-0000000014b9] 30582 1726855328.24565: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b9 30582 1726855328.24837: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014b9 30582 1726855328.24840: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855328.24899: no more pending results, returning what we have 30582 1726855328.24904: results queue empty 30582 1726855328.24905: checking for any_errors_fatal 30582 1726855328.24913: done checking for any_errors_fatal 30582 1726855328.24914: checking for max_fail_percentage 30582 1726855328.24916: done checking for max_fail_percentage 30582 1726855328.24917: checking to see if all hosts have failed and the running result is not ok 30582 1726855328.24918: done checking to see if all hosts have failed 30582 1726855328.24919: getting the remaining hosts for this loop 30582 1726855328.24921: done getting the remaining hosts for this loop 
30582 1726855328.24925: getting the next task for host managed_node3 30582 1726855328.24938: done getting next task for host managed_node3 30582 1726855328.24942: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855328.24952: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855328.24980: getting variables 30582 1726855328.24983: in VariableManager get_vars() 30582 1726855328.25028: Calling all_inventory to load vars for managed_node3 30582 1726855328.25031: Calling groups_inventory to load vars for managed_node3 30582 1726855328.25033: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855328.25046: Calling all_plugins_play to load vars for managed_node3 30582 1726855328.25049: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855328.25052: Calling groups_plugins_play to load vars for managed_node3 30582 1726855328.26663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855328.28926: done with get_vars() 30582 1726855328.29075: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:08 -0400 (0:00:00.073) 0:01:04.642 ****** 30582 1726855328.29303: entering _queue_task() for managed_node3/service_facts 30582 1726855328.30018: worker is 1 (out of 1 available) 30582 1726855328.30031: exiting _queue_task() for managed_node3/service_facts 30582 1726855328.30159: done queuing things up, now waiting for results queue to drain 30582 1726855328.30162: waiting for pending results... 
30582 1726855328.30718: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855328.31061: in run() - task 0affcc66-ac2b-aa83-7d57-0000000014bb 30582 1726855328.31138: variable 'ansible_search_path' from source: unknown 30582 1726855328.31142: variable 'ansible_search_path' from source: unknown 30582 1726855328.31184: calling self._execute() 30582 1726855328.31392: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855328.31399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855328.31409: variable 'omit' from source: magic vars 30582 1726855328.32181: variable 'ansible_distribution_major_version' from source: facts 30582 1726855328.32192: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855328.32206: variable 'omit' from source: magic vars 30582 1726855328.32337: variable 'omit' from source: magic vars 30582 1726855328.32340: variable 'omit' from source: magic vars 30582 1726855328.32372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855328.32412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855328.32442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855328.32455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855328.32472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855328.32592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855328.32596: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855328.32599: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855328.32652: Set connection var ansible_timeout to 10 30582 1726855328.32662: Set connection var ansible_connection to ssh 30582 1726855328.32665: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855328.32667: Set connection var ansible_pipelining to False 30582 1726855328.32672: Set connection var ansible_shell_executable to /bin/sh 30582 1726855328.32675: Set connection var ansible_shell_type to sh 30582 1726855328.32718: variable 'ansible_shell_executable' from source: unknown 30582 1726855328.32722: variable 'ansible_connection' from source: unknown 30582 1726855328.32725: variable 'ansible_module_compression' from source: unknown 30582 1726855328.32727: variable 'ansible_shell_type' from source: unknown 30582 1726855328.32730: variable 'ansible_shell_executable' from source: unknown 30582 1726855328.32732: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855328.32734: variable 'ansible_pipelining' from source: unknown 30582 1726855328.32736: variable 'ansible_timeout' from source: unknown 30582 1726855328.32739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855328.33197: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855328.33201: variable 'omit' from source: magic vars 30582 1726855328.33203: starting attempt loop 30582 1726855328.33205: running the handler 30582 1726855328.33207: _low_level_execute_command(): starting 30582 1726855328.33210: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855328.33781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855328.33795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855328.33807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855328.33840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.33963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.33967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855328.33972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855328.34072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855328.34179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855328.35877: stdout chunk (state=3): >>>/root <<< 30582 1726855328.36008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855328.36014: stdout chunk (state=3): >>><<< 30582 1726855328.36022: stderr chunk (state=3): >>><<< 30582 1726855328.36056: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855328.36072: _low_level_execute_command(): starting 30582 1726855328.36076: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833 `" && echo ansible-tmp-1726855328.3605576-33607-235336355662833="` echo /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833 `" ) && sleep 0' 30582 1726855328.37465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.37480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855328.37589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855328.37656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855328.39634: stdout chunk (state=3): >>>ansible-tmp-1726855328.3605576-33607-235336355662833=/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833 <<< 30582 1726855328.39728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855328.39779: stderr chunk (state=3): >>><<< 30582 1726855328.39783: stdout chunk (state=3): >>><<< 30582 1726855328.39815: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855328.3605576-33607-235336355662833=/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855328.39861: variable 'ansible_module_compression' from source: unknown 30582 1726855328.40014: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855328.40128: variable 'ansible_facts' from source: unknown 30582 1726855328.40391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py 30582 1726855328.40613: Sending initial data 30582 1726855328.40616: Sent initial data (162 bytes) 30582 1726855328.41172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.41191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855328.41332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855328.41336: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.41389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855328.41440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855328.43558: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855328.43594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp4xt8prjv /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py <<< 30582 1726855328.43599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp4xt8prjv" to remote "/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py" <<< 30582 1726855328.45035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855328.45101: stderr chunk (state=3): >>><<< 30582 1726855328.45193: stdout chunk (state=3): >>><<< 30582 1726855328.45197: done transferring module to remote 30582 1726855328.45210: _low_level_execute_command(): starting 30582 1726855328.45220: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/ /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py && sleep 0' 30582 1726855328.46776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855328.46807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855328.46824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855328.46981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855328.47105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855328.47204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855328.49141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855328.49154: stdout chunk (state=3): >>><<< 30582 1726855328.49167: stderr chunk (state=3): >>><<< 30582 1726855328.49194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855328.49474: _low_level_execute_command(): starting 30582 1726855328.49478: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/AnsiballZ_service_facts.py && sleep 0' 30582 1726855328.50519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855328.50554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855328.50579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855328.50604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855328.50732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 30582 1726855330.05197: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855330.05218: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855330.05241: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integrat<<< 30582 1726855330.05268: stdout chunk (state=3): >>>ion.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 30582 1726855330.06799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855330.06823: stderr chunk (state=3): >>><<< 30582 1726855330.06827: stdout chunk (state=3): >>><<< 30582 1726855330.06859: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855330.07581: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30582 1726855330.07590: _low_level_execute_command(): starting
30582 1726855330.07595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855328.3605576-33607-235336355662833/ > /dev/null 2>&1 && sleep 0'
30582 1726855330.08050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855330.08053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855330.08056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
30582 1726855330.08058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855330.08114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<<
30582 1726855330.08117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30582 1726855330.08123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855330.08183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855330.10008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855330.10034: stderr chunk (state=3): >>><<<
30582 1726855330.10037: stdout chunk (state=3): >>><<<
30582 1726855330.10053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit
status from master 0
30582 1726855330.10062: handler run complete
30582 1726855330.10177: variable 'ansible_facts' from source: unknown
30582 1726855330.10265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855330.10551: variable 'ansible_facts' from source: unknown
30582 1726855330.10636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855330.10750: attempt loop complete, returning result
30582 1726855330.10755: _execute() done
30582 1726855330.10758: dumping result to json
30582 1726855330.10795: done dumping result, returning
30582 1726855330.10804: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-0000000014bb]
30582 1726855330.10808: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014bb
30582 1726855330.11578: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014bb
30582 1726855330.11581: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30582 1726855330.11638: no more pending results, returning what we have
30582 1726855330.11640: results queue empty
30582 1726855330.11641: checking for any_errors_fatal
30582 1726855330.11643: done checking for any_errors_fatal
30582 1726855330.11644: checking for max_fail_percentage
30582 1726855330.11645: done checking for max_fail_percentage
30582 1726855330.11645: checking to see if all hosts have failed and the running result is not ok
30582 1726855330.11646: done checking to see if all hosts have failed
30582 1726855330.11646: getting the remaining hosts for this loop
30582 1726855330.11647: done getting the remaining hosts for this loop
30582 1726855330.11650: getting the next task for host managed_node3
30582 1726855330.11654: done getting next task for host managed_node3
30582 1726855330.11656: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed
30582 1726855330.11661: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855330.11667: getting variables
30582 1726855330.11670: in VariableManager get_vars()
30582 1726855330.11693: Calling all_inventory to load vars for managed_node3
30582 1726855330.11695: Calling groups_inventory to load vars for managed_node3
30582 1726855330.11696: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855330.11703: Calling all_plugins_play to load vars for managed_node3
30582 1726855330.11705: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855330.11711: Calling groups_plugins_play to load vars for managed_node3
30582 1726855330.12403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855330.13307: done with get_vars()
30582 1726855330.13330: done getting variables

TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Friday 20 September 2024  14:02:10 -0400 (0:00:01.841)       0:01:06.484 ******
30582 1726855330.13411: entering _queue_task() for managed_node3/package_facts
30582 1726855330.13691: worker is 1 (out of 1 available)
30582 1726855330.13706: exiting _queue_task() for managed_node3/package_facts
30582 1726855330.13719: done queuing things up, now waiting for results queue to drain
30582 1726855330.13721: waiting for pending results...
30582 1726855330.13911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855330.14016: in run() - task 0affcc66-ac2b-aa83-7d57-0000000014bc 30582 1726855330.14029: variable 'ansible_search_path' from source: unknown 30582 1726855330.14033: variable 'ansible_search_path' from source: unknown 30582 1726855330.14064: calling self._execute() 30582 1726855330.14139: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.14143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.14152: variable 'omit' from source: magic vars 30582 1726855330.14439: variable 'ansible_distribution_major_version' from source: facts 30582 1726855330.14448: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855330.14454: variable 'omit' from source: magic vars 30582 1726855330.14506: variable 'omit' from source: magic vars 30582 1726855330.14529: variable 'omit' from source: magic vars 30582 1726855330.14563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855330.14592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855330.14612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855330.14625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855330.14635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855330.14660: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855330.14663: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.14666: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855330.14743: Set connection var ansible_timeout to 10 30582 1726855330.14746: Set connection var ansible_connection to ssh 30582 1726855330.14751: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855330.14756: Set connection var ansible_pipelining to False 30582 1726855330.14761: Set connection var ansible_shell_executable to /bin/sh 30582 1726855330.14763: Set connection var ansible_shell_type to sh 30582 1726855330.14781: variable 'ansible_shell_executable' from source: unknown 30582 1726855330.14784: variable 'ansible_connection' from source: unknown 30582 1726855330.14789: variable 'ansible_module_compression' from source: unknown 30582 1726855330.14791: variable 'ansible_shell_type' from source: unknown 30582 1726855330.14794: variable 'ansible_shell_executable' from source: unknown 30582 1726855330.14796: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.14798: variable 'ansible_pipelining' from source: unknown 30582 1726855330.14800: variable 'ansible_timeout' from source: unknown 30582 1726855330.14804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.14952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855330.14961: variable 'omit' from source: magic vars 30582 1726855330.14966: starting attempt loop 30582 1726855330.14972: running the handler 30582 1726855330.14981: _low_level_execute_command(): starting 30582 1726855330.14989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855330.15517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855330.15522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.15525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855330.15528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.15576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855330.15580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.15582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.15657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.17325: stdout chunk (state=3): >>>/root <<< 30582 1726855330.17423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855330.17457: stderr chunk (state=3): >>><<< 30582 1726855330.17460: stdout chunk (state=3): >>><<< 30582 1726855330.17483: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855330.17496: _low_level_execute_command(): starting 30582 1726855330.17502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861 `" && echo ansible-tmp-1726855330.1748183-33692-92053763034861="` echo /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861 `" ) && sleep 0' 30582 1726855330.17936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855330.17939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855330.17942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.17951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855330.17954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.18003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855330.18012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.18015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.18073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.19979: stdout chunk (state=3): >>>ansible-tmp-1726855330.1748183-33692-92053763034861=/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861 <<< 30582 1726855330.20080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855330.20111: stderr chunk (state=3): >>><<< 30582 1726855330.20115: stdout chunk (state=3): >>><<< 30582 1726855330.20128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855330.1748183-33692-92053763034861=/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855330.20167: variable 'ansible_module_compression' from source: unknown 30582 1726855330.20210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855330.20262: variable 'ansible_facts' from source: unknown 30582 1726855330.20384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py 30582 1726855330.20490: Sending initial data 30582 1726855330.20494: Sent initial data (161 bytes) 30582 1726855330.21138: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.21177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855330.21201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.21218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.21363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.22947: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855330.22960: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855330.22980: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30582 1726855330.22993: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30582 1726855330.23004: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30582 1726855330.23014: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30582 1726855330.23029: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855330.23111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855330.23164: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdfs1r3br /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py <<< 30582 1726855330.23194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py" <<< 30582 1726855330.23252: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdfs1r3br" to remote "/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py" <<< 30582 1726855330.24935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855330.25010: stderr chunk (state=3): >>><<< 30582 1726855330.25015: stdout chunk (state=3): >>><<< 30582 1726855330.25030: done transferring module to remote 30582 1726855330.25045: _low_level_execute_command(): starting 30582 1726855330.25061: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/ /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py && sleep 0' 30582 1726855330.25724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855330.25747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855330.25766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855330.25797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855330.25870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.25925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.25975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.26062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.27956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855330.27978: stdout chunk (state=3): >>><<< 30582 1726855330.27981: stderr chunk (state=3): >>><<< 30582 1726855330.28078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855330.28082: _low_level_execute_command(): starting 30582 1726855330.28085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/AnsiballZ_package_facts.py && sleep 0' 30582 1726855330.28666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855330.28682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855330.28699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855330.28722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855330.28739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855330.28831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855330.28858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855330.28879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.28905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.29021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.73308: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30582 1726855330.73489: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855330.73502: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": 
[{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855330.73531: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855330.75309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855330.75396: stderr chunk (state=3): >>><<< 30582 1726855330.75402: stdout chunk (state=3): >>><<< 30582 1726855330.75604: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855330.77664: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855330.77701: _low_level_execute_command(): starting 30582 1726855330.77711: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855330.1748183-33692-92053763034861/ > /dev/null 2>&1 && sleep 0' 30582 1726855330.78347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855330.78359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855330.78403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855330.78422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855330.78434: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855330.78502: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855330.78526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855330.78544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855330.78565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855330.78659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855330.80639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855330.80644: stdout chunk (state=3): >>><<< 30582 1726855330.80649: stderr chunk (state=3): >>><<< 30582 1726855330.80690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855330.80696: handler run complete 30582 1726855330.81706: variable 'ansible_facts' from source: unknown 30582 1726855330.82224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855330.83299: variable 'ansible_facts' from source: unknown 30582 1726855330.83552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855330.84086: attempt loop complete, returning result 30582 1726855330.84093: _execute() done 30582 1726855330.84095: dumping result to json 30582 1726855330.84280: done dumping result, returning 30582 1726855330.84292: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-0000000014bc] 30582 1726855330.84492: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014bc 30582 1726855330.85942: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000014bc 30582 1726855330.85945: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855330.86042: no more pending results, returning what we have 30582 1726855330.86045: results queue empty 30582 1726855330.86045: checking for any_errors_fatal 30582 1726855330.86049: done checking for any_errors_fatal 30582 1726855330.86049: checking for max_fail_percentage 30582 1726855330.86051: done checking for max_fail_percentage 30582 1726855330.86051: checking to see if all hosts have failed and the running result is not ok 30582 1726855330.86052: done checking to see if all hosts have failed 30582 1726855330.86052: getting the remaining hosts for this loop 30582 1726855330.86053: done getting the remaining 
hosts for this loop 30582 1726855330.86056: getting the next task for host managed_node3 30582 1726855330.86061: done getting next task for host managed_node3 30582 1726855330.86064: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855330.86067: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855330.86077: getting variables 30582 1726855330.86078: in VariableManager get_vars() 30582 1726855330.86103: Calling all_inventory to load vars for managed_node3 30582 1726855330.86105: Calling groups_inventory to load vars for managed_node3 30582 1726855330.86106: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855330.86114: Calling all_plugins_play to load vars for managed_node3 30582 1726855330.86115: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855330.86117: Calling groups_plugins_play to load vars for managed_node3 30582 1726855330.86846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855330.88424: done with get_vars() 30582 1726855330.88455: done getting variables 30582 1726855330.88517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:10 -0400 (0:00:00.751) 0:01:07.235 ****** 30582 1726855330.88545: entering _queue_task() for managed_node3/debug 30582 1726855330.88828: worker is 1 (out of 1 available) 30582 1726855330.88841: exiting _queue_task() for managed_node3/debug 30582 1726855330.88856: done queuing things up, now waiting for results queue to drain 30582 1726855330.88858: waiting for pending results... 
30582 1726855330.89067: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855330.89160: in run() - task 0affcc66-ac2b-aa83-7d57-000000001460 30582 1726855330.89174: variable 'ansible_search_path' from source: unknown 30582 1726855330.89177: variable 'ansible_search_path' from source: unknown 30582 1726855330.89205: calling self._execute() 30582 1726855330.89277: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.89282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.89292: variable 'omit' from source: magic vars 30582 1726855330.89573: variable 'ansible_distribution_major_version' from source: facts 30582 1726855330.89581: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855330.89587: variable 'omit' from source: magic vars 30582 1726855330.89633: variable 'omit' from source: magic vars 30582 1726855330.89708: variable 'network_provider' from source: set_fact 30582 1726855330.89719: variable 'omit' from source: magic vars 30582 1726855330.89751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855330.89778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855330.89795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855330.89810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855330.89824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855330.89847: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855330.89850: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855330.89853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.89932: Set connection var ansible_timeout to 10 30582 1726855330.89935: Set connection var ansible_connection to ssh 30582 1726855330.89938: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855330.89940: Set connection var ansible_pipelining to False 30582 1726855330.89942: Set connection var ansible_shell_executable to /bin/sh 30582 1726855330.89947: Set connection var ansible_shell_type to sh 30582 1726855330.89963: variable 'ansible_shell_executable' from source: unknown 30582 1726855330.89967: variable 'ansible_connection' from source: unknown 30582 1726855330.89973: variable 'ansible_module_compression' from source: unknown 30582 1726855330.89975: variable 'ansible_shell_type' from source: unknown 30582 1726855330.89978: variable 'ansible_shell_executable' from source: unknown 30582 1726855330.89980: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.89983: variable 'ansible_pipelining' from source: unknown 30582 1726855330.89985: variable 'ansible_timeout' from source: unknown 30582 1726855330.89986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.90115: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855330.90121: variable 'omit' from source: magic vars 30582 1726855330.90126: starting attempt loop 30582 1726855330.90129: running the handler 30582 1726855330.90186: handler run complete 30582 1726855330.90194: attempt loop complete, returning result 30582 1726855330.90197: _execute() done 30582 1726855330.90201: dumping result to json 30582 1726855330.90207: done dumping result, returning 
30582 1726855330.90234: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000001460] 30582 1726855330.90238: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001460 30582 1726855330.90339: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001460 30582 1726855330.90342: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855330.90458: no more pending results, returning what we have 30582 1726855330.90461: results queue empty 30582 1726855330.90462: checking for any_errors_fatal 30582 1726855330.90467: done checking for any_errors_fatal 30582 1726855330.90470: checking for max_fail_percentage 30582 1726855330.90472: done checking for max_fail_percentage 30582 1726855330.90473: checking to see if all hosts have failed and the running result is not ok 30582 1726855330.90473: done checking to see if all hosts have failed 30582 1726855330.90474: getting the remaining hosts for this loop 30582 1726855330.90475: done getting the remaining hosts for this loop 30582 1726855330.90481: getting the next task for host managed_node3 30582 1726855330.90489: done getting next task for host managed_node3 30582 1726855330.90492: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855330.90496: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855330.90506: getting variables 30582 1726855330.90508: in VariableManager get_vars() 30582 1726855330.90539: Calling all_inventory to load vars for managed_node3 30582 1726855330.90542: Calling groups_inventory to load vars for managed_node3 30582 1726855330.90543: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855330.90551: Calling all_plugins_play to load vars for managed_node3 30582 1726855330.90554: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855330.90556: Calling groups_plugins_play to load vars for managed_node3 30582 1726855330.91908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855330.93084: done with get_vars() 30582 1726855330.93109: done getting variables 30582 1726855330.93155: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:10 -0400 (0:00:00.046) 0:01:07.281 ****** 30582 1726855330.93190: entering _queue_task() for managed_node3/fail 30582 1726855330.93459: worker is 1 (out of 1 available) 30582 1726855330.93474: exiting _queue_task() for managed_node3/fail 30582 1726855330.93485: done queuing things up, now waiting for results queue to drain 30582 1726855330.93486: waiting for pending results... 30582 1726855330.93678: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855330.93780: in run() - task 0affcc66-ac2b-aa83-7d57-000000001461 30582 1726855330.93794: variable 'ansible_search_path' from source: unknown 30582 1726855330.93797: variable 'ansible_search_path' from source: unknown 30582 1726855330.93829: calling self._execute() 30582 1726855330.93906: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.93910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.93920: variable 'omit' from source: magic vars 30582 1726855330.94492: variable 'ansible_distribution_major_version' from source: facts 30582 1726855330.94497: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855330.94551: variable 'network_state' from source: role '' defaults 30582 1726855330.94561: Evaluated conditional (network_state != {}): False 30582 1726855330.94565: when evaluation is False, skipping this task 30582 1726855330.94568: _execute() done 30582 1726855330.94570: dumping result to json 30582 1726855330.94576: done dumping result, returning 30582 1726855330.94585: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000001461] 30582 1726855330.94590: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001461 30582 1726855330.94684: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001461 30582 1726855330.94689: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855330.94765: no more pending results, returning what we have 30582 1726855330.94769: results queue empty 30582 1726855330.94770: checking for any_errors_fatal 30582 1726855330.94777: done checking for any_errors_fatal 30582 1726855330.94778: checking for max_fail_percentage 30582 1726855330.94780: done checking for max_fail_percentage 30582 1726855330.94781: checking to see if all hosts have failed and the running result is not ok 30582 1726855330.94782: done checking to see if all hosts have failed 30582 1726855330.94783: getting the remaining hosts for this loop 30582 1726855330.94784: done getting the remaining hosts for this loop 30582 1726855330.94791: getting the next task for host managed_node3 30582 1726855330.94801: done getting next task for host managed_node3 30582 1726855330.94805: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855330.94811: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855330.94839: getting variables 30582 1726855330.94841: in VariableManager get_vars() 30582 1726855330.94885: Calling all_inventory to load vars for managed_node3 30582 1726855330.94992: Calling groups_inventory to load vars for managed_node3 30582 1726855330.95001: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855330.95015: Calling all_plugins_play to load vars for managed_node3 30582 1726855330.95019: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855330.95022: Calling groups_plugins_play to load vars for managed_node3 30582 1726855330.96438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855330.97342: done with get_vars() 30582 1726855330.97367: done getting variables 30582 1726855330.97418: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:10 -0400 (0:00:00.042) 0:01:07.324 ****** 30582 1726855330.97445: entering _queue_task() for managed_node3/fail 30582 1726855330.97727: worker is 1 (out of 1 available) 30582 1726855330.97742: exiting _queue_task() for managed_node3/fail 30582 1726855330.97753: done queuing things up, now waiting for results queue to drain 30582 1726855330.97755: waiting for pending results... 30582 1726855330.97950: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855330.98050: in run() - task 0affcc66-ac2b-aa83-7d57-000000001462 30582 1726855330.98061: variable 'ansible_search_path' from source: unknown 30582 1726855330.98065: variable 'ansible_search_path' from source: unknown 30582 1726855330.98099: calling self._execute() 30582 1726855330.98167: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855330.98175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855330.98181: variable 'omit' from source: magic vars 30582 1726855330.98609: variable 'ansible_distribution_major_version' from source: facts 30582 1726855330.98613: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855330.98727: variable 'network_state' from source: role '' defaults 30582 1726855330.98744: Evaluated conditional (network_state != {}): False 30582 1726855330.98753: when evaluation is False, skipping this task 30582 1726855330.98760: _execute() done 30582 1726855330.98768: dumping result to json 30582 1726855330.98779: done dumping result, returning 30582 1726855330.98827: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000001462] 30582 1726855330.98831: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001462 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855330.98972: no more pending results, returning what we have 30582 1726855330.98976: results queue empty 30582 1726855330.98978: checking for any_errors_fatal 30582 1726855330.98985: done checking for any_errors_fatal 30582 1726855330.98986: checking for max_fail_percentage 30582 1726855330.98990: done checking for max_fail_percentage 30582 1726855330.98991: checking to see if all hosts have failed and the running result is not ok 30582 1726855330.98992: done checking to see if all hosts have failed 30582 1726855330.98993: getting the remaining hosts for this loop 30582 1726855330.98994: done getting the remaining hosts for this loop 30582 1726855330.98998: getting the next task for host managed_node3 30582 1726855330.99007: done getting next task for host managed_node3 30582 1726855330.99010: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855330.99015: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855330.99039: getting variables 30582 1726855330.99041: in VariableManager get_vars() 30582 1726855330.99081: Calling all_inventory to load vars for managed_node3 30582 1726855330.99084: Calling groups_inventory to load vars for managed_node3 30582 1726855330.99086: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855330.99209: Calling all_plugins_play to load vars for managed_node3 30582 1726855330.99213: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855330.99218: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001462 30582 1726855330.99221: WORKER PROCESS EXITING 30582 1726855330.99225: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.01130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.02813: done with get_vars() 30582 1726855331.02847: done getting variables 30582 1726855331.02920: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:11 -0400 (0:00:00.055) 0:01:07.379 ****** 30582 1726855331.02957: entering _queue_task() for managed_node3/fail 30582 1726855331.03382: worker is 1 (out of 1 available) 30582 1726855331.03398: exiting _queue_task() for managed_node3/fail 30582 1726855331.03411: done queuing things up, now waiting for results queue to drain 30582 1726855331.03412: waiting for pending results... 30582 1726855331.03708: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855331.03846: in run() - task 0affcc66-ac2b-aa83-7d57-000000001463 30582 1726855331.03864: variable 'ansible_search_path' from source: unknown 30582 1726855331.03882: variable 'ansible_search_path' from source: unknown 30582 1726855331.03905: calling self._execute() 30582 1726855331.04001: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.04006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.04042: variable 'omit' from source: magic vars 30582 1726855331.04422: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.04430: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.04645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.06928: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.06992: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.07021: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855331.07052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.07074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.07134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.07159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.07178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.07206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.07217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.07294: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.07308: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855331.07396: variable 'ansible_distribution' from source: facts 30582 1726855331.07399: variable '__network_rh_distros' from source: role '' defaults 30582 1726855331.07407: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855331.07566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.07590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.07606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.07633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.07643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.07676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.07697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.07715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.07739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.07749: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.07778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.07803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.07817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.07840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.07851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.08051: variable 'network_connections' from source: include params 30582 1726855331.08061: variable 'interface' from source: play vars 30582 1726855331.08109: variable 'interface' from source: play vars 30582 1726855331.08123: variable 'network_state' from source: role '' defaults 30582 1726855331.08167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.08285: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.08313: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 
1726855331.08339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.08359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.08402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.08417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.08439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.08461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855331.08481: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855331.08484: when evaluation is False, skipping this task 30582 1726855331.08489: _execute() done 30582 1726855331.08492: dumping result to json 30582 1726855331.08494: done dumping result, returning 30582 1726855331.08501: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000001463] 30582 1726855331.08506: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001463 30582 1726855331.08596: done sending task 
result for task 0affcc66-ac2b-aa83-7d57-000000001463 30582 1726855331.08599: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855331.08649: no more pending results, returning what we have 30582 1726855331.08655: results queue empty 30582 1726855331.08657: checking for any_errors_fatal 30582 1726855331.08663: done checking for any_errors_fatal 30582 1726855331.08664: checking for max_fail_percentage 30582 1726855331.08671: done checking for max_fail_percentage 30582 1726855331.08672: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.08673: done checking to see if all hosts have failed 30582 1726855331.08673: getting the remaining hosts for this loop 30582 1726855331.08675: done getting the remaining hosts for this loop 30582 1726855331.08679: getting the next task for host managed_node3 30582 1726855331.08689: done getting next task for host managed_node3 30582 1726855331.08693: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855331.08698: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855331.08725: getting variables 30582 1726855331.08727: in VariableManager get_vars() 30582 1726855331.08766: Calling all_inventory to load vars for managed_node3 30582 1726855331.08770: Calling groups_inventory to load vars for managed_node3 30582 1726855331.08773: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.08783: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.08785: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.08797: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.10189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.11090: done with get_vars() 30582 1726855331.11111: done getting variables 30582 1726855331.11154: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:11 -0400 (0:00:00.082) 0:01:07.461 ****** 30582 1726855331.11180: entering _queue_task() for managed_node3/dnf 30582 1726855331.11466: worker is 1 (out of 1 available) 30582 1726855331.11480: exiting _queue_task() for managed_node3/dnf 30582 1726855331.11493: done queuing things up, now waiting for results queue to drain 30582 1726855331.11495: waiting for pending results... 30582 1726855331.11693: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855331.11791: in run() - task 0affcc66-ac2b-aa83-7d57-000000001464 30582 1726855331.11803: variable 'ansible_search_path' from source: unknown 30582 1726855331.11806: variable 'ansible_search_path' from source: unknown 30582 1726855331.11837: calling self._execute() 30582 1726855331.11913: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.11917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.11925: variable 'omit' from source: magic vars 30582 1726855331.12214: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.12223: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.12365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.15292: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.15340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.15384: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.15410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.15429: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.15498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.15518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.15535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.15561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.15575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.15657: variable 'ansible_distribution' from source: facts 30582 1726855331.15660: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.15685: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855331.15760: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855331.15992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.15995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.15998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.16000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.16003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.16049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.16076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.16108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.16160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.16231: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.16235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.16260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.16292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.16345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.16365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.16509: variable 'network_connections' from source: include params 30582 1726855331.16525: variable 'interface' from source: play vars 30582 1726855331.16596: variable 'interface' from source: play vars 30582 1726855331.16777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.16857: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.16906: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855331.16938: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.16969: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.17024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.17051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.17133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.17190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855331.17325: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855331.18094: variable 'network_connections' from source: include params 30582 1726855331.18099: variable 'interface' from source: play vars 30582 1726855331.18101: variable 'interface' from source: play vars 30582 1726855331.18103: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855331.18105: when evaluation is False, skipping this task 30582 1726855331.18106: _execute() done 30582 1726855331.18108: dumping result to json 30582 1726855331.18109: done dumping result, returning 30582 1726855331.18111: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001464] 30582 
1726855331.18113: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001464 30582 1726855331.18180: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001464 30582 1726855331.18183: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855331.18245: no more pending results, returning what we have 30582 1726855331.18249: results queue empty 30582 1726855331.18250: checking for any_errors_fatal 30582 1726855331.18257: done checking for any_errors_fatal 30582 1726855331.18258: checking for max_fail_percentage 30582 1726855331.18260: done checking for max_fail_percentage 30582 1726855331.18261: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.18261: done checking to see if all hosts have failed 30582 1726855331.18262: getting the remaining hosts for this loop 30582 1726855331.18263: done getting the remaining hosts for this loop 30582 1726855331.18267: getting the next task for host managed_node3 30582 1726855331.18277: done getting next task for host managed_node3 30582 1726855331.18281: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855331.18286: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855331.18308: getting variables 30582 1726855331.18309: in VariableManager get_vars() 30582 1726855331.18347: Calling all_inventory to load vars for managed_node3 30582 1726855331.18350: Calling groups_inventory to load vars for managed_node3 30582 1726855331.18352: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.18361: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.18364: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.18366: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.20941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.22574: done with get_vars() 30582 1726855331.22609: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855331.22680: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:11 -0400 (0:00:00.115) 0:01:07.577 ****** 30582 1726855331.22714: entering _queue_task() for managed_node3/yum 30582 1726855331.23296: worker is 1 (out of 1 available) 30582 1726855331.23307: exiting _queue_task() for managed_node3/yum 30582 1726855331.23318: done queuing things up, now waiting for results queue to drain 30582 1726855331.23319: waiting for pending results... 30582 1726855331.23416: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855331.23555: in run() - task 0affcc66-ac2b-aa83-7d57-000000001465 30582 1726855331.23579: variable 'ansible_search_path' from source: unknown 30582 1726855331.23583: variable 'ansible_search_path' from source: unknown 30582 1726855331.23690: calling self._execute() 30582 1726855331.23703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.23707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.23720: variable 'omit' from source: magic vars 30582 1726855331.24267: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.24277: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.24502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.26780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.26865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.26945: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.26949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.26967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.27053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.27078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.27104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.27161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.27164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.27377: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.27380: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855331.27382: when evaluation is False, skipping this task 30582 1726855331.27383: _execute() done 30582 1726855331.27385: dumping result to json 30582 1726855331.27389: done dumping result, returning 30582 1726855331.27391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001465] 30582 1726855331.27393: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001465 30582 1726855331.27462: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001465 30582 1726855331.27465: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855331.27534: no more pending results, returning what we have 30582 1726855331.27539: results queue empty 30582 1726855331.27540: checking for any_errors_fatal 30582 1726855331.27547: done checking for any_errors_fatal 30582 1726855331.27547: checking for max_fail_percentage 30582 1726855331.27550: done checking for max_fail_percentage 30582 1726855331.27551: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.27552: done checking to see if all hosts have failed 30582 1726855331.27553: getting the remaining hosts for this loop 30582 1726855331.27554: done getting the remaining hosts for this loop 30582 1726855331.27559: getting the next task for host managed_node3 30582 1726855331.27567: done getting next task for host managed_node3 30582 1726855331.27571: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855331.27577: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855331.27605: getting variables 30582 1726855331.27607: in VariableManager get_vars() 30582 1726855331.27650: Calling all_inventory to load vars for managed_node3 30582 1726855331.27653: Calling groups_inventory to load vars for managed_node3 30582 1726855331.27655: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.27666: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.27669: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.27672: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.29627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.36860: done with get_vars() 30582 1726855331.36895: done getting variables 30582 1726855331.36944: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:11 -0400 (0:00:00.142) 0:01:07.719 ****** 30582 1726855331.36979: entering _queue_task() for managed_node3/fail 30582 1726855331.37359: worker is 1 (out of 1 available) 30582 1726855331.37376: exiting _queue_task() for managed_node3/fail 30582 1726855331.37392: done queuing things up, now waiting for results queue to drain 30582 1726855331.37394: waiting for pending results... 30582 1726855331.37631: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855331.37738: in run() - task 0affcc66-ac2b-aa83-7d57-000000001466 30582 1726855331.37751: variable 'ansible_search_path' from source: unknown 30582 1726855331.37756: variable 'ansible_search_path' from source: unknown 30582 1726855331.37786: calling self._execute() 30582 1726855331.37861: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.37867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.37879: variable 'omit' from source: magic vars 30582 1726855331.38172: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.38185: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.38272: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855331.38412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.40492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.40497: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.40598: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.40602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.40612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.40647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.40678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.40705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.40743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.40758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.40811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.40833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.40857: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.40900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.40917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.40953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.40976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.41003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.41045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.41052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.41223: variable 'network_connections' from source: include params 30582 1726855331.41234: variable 'interface' from source: play vars 30582 1726855331.41292: variable 'interface' from source: play vars 30582 1726855331.41345: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.41471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.41501: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855331.41524: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.41546: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.41579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.41599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.41616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.41635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855331.41682: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855331.41842: variable 'network_connections' from source: include params 30582 1726855331.41846: variable 'interface' from source: play vars 30582 1726855331.41892: variable 'interface' from source: play vars 30582 1726855331.41914: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855331.41918: when evaluation is False, skipping this task 30582 
1726855331.41920: _execute() done 30582 1726855331.41923: dumping result to json 30582 1726855331.41925: done dumping result, returning 30582 1726855331.41932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001466] 30582 1726855331.41937: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001466 30582 1726855331.42029: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001466 30582 1726855331.42031: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855331.42085: no more pending results, returning what we have 30582 1726855331.42091: results queue empty 30582 1726855331.42092: checking for any_errors_fatal 30582 1726855331.42101: done checking for any_errors_fatal 30582 1726855331.42101: checking for max_fail_percentage 30582 1726855331.42103: done checking for max_fail_percentage 30582 1726855331.42104: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.42105: done checking to see if all hosts have failed 30582 1726855331.42105: getting the remaining hosts for this loop 30582 1726855331.42107: done getting the remaining hosts for this loop 30582 1726855331.42110: getting the next task for host managed_node3 30582 1726855331.42118: done getting next task for host managed_node3 30582 1726855331.42122: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855331.42127: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855331.42151: getting variables 30582 1726855331.42153: in VariableManager get_vars() 30582 1726855331.42193: Calling all_inventory to load vars for managed_node3 30582 1726855331.42196: Calling groups_inventory to load vars for managed_node3 30582 1726855331.42198: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.42208: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.42211: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.42213: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.43154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.44449: done with get_vars() 30582 1726855331.44474: done getting variables 30582 1726855331.44522: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:11 -0400 (0:00:00.075) 0:01:07.795 ****** 30582 1726855331.44549: entering _queue_task() for managed_node3/package 30582 1726855331.44822: worker is 1 (out of 1 available) 30582 1726855331.44837: exiting _queue_task() for managed_node3/package 30582 1726855331.44851: done queuing things up, now waiting for results queue to drain 30582 1726855331.44853: waiting for pending results... 30582 1726855331.45049: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855331.45161: in run() - task 0affcc66-ac2b-aa83-7d57-000000001467 30582 1726855331.45174: variable 'ansible_search_path' from source: unknown 30582 1726855331.45179: variable 'ansible_search_path' from source: unknown 30582 1726855331.45215: calling self._execute() 30582 1726855331.45292: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.45298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.45312: variable 'omit' from source: magic vars 30582 1726855331.45598: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.45608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.45745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.45940: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.45977: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855331.46006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.46065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.46149: variable 'network_packages' from source: role '' defaults 30582 1726855331.46226: variable '__network_provider_setup' from source: role '' defaults 30582 1726855331.46235: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855331.46281: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855331.46293: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855331.46336: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855331.46455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.47844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.47890: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.47921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.47946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.47966: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.48319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.48339: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.48362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.48391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.48402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.48434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.48450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.48470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.48499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.48509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855331.48653: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855331.48733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.48749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.48766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.48794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.48808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.48874: variable 'ansible_python' from source: facts 30582 1726855331.48884: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855331.48965: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855331.49029: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855331.49116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.49135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.49152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.49179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.49191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.49222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.49244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.49261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.49289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.49300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.49403: variable 'network_connections' from source: include params 
30582 1726855331.49410: variable 'interface' from source: play vars 30582 1726855331.49486: variable 'interface' from source: play vars 30582 1726855331.49536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.49557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.49584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.49607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855331.49643: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855331.49829: variable 'network_connections' from source: include params 30582 1726855331.49833: variable 'interface' from source: play vars 30582 1726855331.49908: variable 'interface' from source: play vars 30582 1726855331.49931: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855331.49988: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855331.50191: variable 'network_connections' from source: include params 30582 1726855331.50196: variable 'interface' from source: play vars 30582 1726855331.50243: variable 'interface' from source: play vars 30582 1726855331.50260: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855331.50318: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855331.50518: variable 'network_connections' 
from source: include params 30582 1726855331.50521: variable 'interface' from source: play vars 30582 1726855331.50570: variable 'interface' from source: play vars 30582 1726855331.50609: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855331.50661: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855331.50667: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855331.50791: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855331.50979: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855331.51386: variable 'network_connections' from source: include params 30582 1726855331.51391: variable 'interface' from source: play vars 30582 1726855331.51447: variable 'interface' from source: play vars 30582 1726855331.51454: variable 'ansible_distribution' from source: facts 30582 1726855331.51457: variable '__network_rh_distros' from source: role '' defaults 30582 1726855331.51465: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.51492: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855331.51641: variable 'ansible_distribution' from source: facts 30582 1726855331.51644: variable '__network_rh_distros' from source: role '' defaults 30582 1726855331.51646: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.51659: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855331.51835: variable 'ansible_distribution' from source: facts 30582 1726855331.51838: variable '__network_rh_distros' from source: role '' defaults 30582 1726855331.51841: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.51900: variable 'network_provider' from source: set_fact 30582 
1726855331.51903: variable 'ansible_facts' from source: unknown 30582 1726855331.52311: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855331.52314: when evaluation is False, skipping this task 30582 1726855331.52317: _execute() done 30582 1726855331.52319: dumping result to json 30582 1726855331.52329: done dumping result, returning 30582 1726855331.52331: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000001467] 30582 1726855331.52334: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001467 30582 1726855331.52431: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001467 30582 1726855331.52435: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855331.52490: no more pending results, returning what we have 30582 1726855331.52493: results queue empty 30582 1726855331.52494: checking for any_errors_fatal 30582 1726855331.52501: done checking for any_errors_fatal 30582 1726855331.52502: checking for max_fail_percentage 30582 1726855331.52504: done checking for max_fail_percentage 30582 1726855331.52505: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.52505: done checking to see if all hosts have failed 30582 1726855331.52506: getting the remaining hosts for this loop 30582 1726855331.52508: done getting the remaining hosts for this loop 30582 1726855331.52511: getting the next task for host managed_node3 30582 1726855331.52519: done getting next task for host managed_node3 30582 1726855331.52523: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855331.52528: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855331.52558: getting variables 30582 1726855331.52560: in VariableManager get_vars() 30582 1726855331.52604: Calling all_inventory to load vars for managed_node3 30582 1726855331.52607: Calling groups_inventory to load vars for managed_node3 30582 1726855331.52609: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.52619: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.52622: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.52624: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.53614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.54542: done with get_vars() 30582 1726855331.54568: done getting variables 30582 1726855331.54630: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:11 -0400 (0:00:00.101) 0:01:07.896 ****** 30582 1726855331.54669: entering _queue_task() for managed_node3/package 30582 1726855331.55040: worker is 1 (out of 1 available) 30582 1726855331.55053: exiting _queue_task() for managed_node3/package 30582 1726855331.55066: done queuing things up, now waiting for results queue to drain 30582 1726855331.55067: waiting for pending results... 
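The `Install packages` skip above records its guard verbatim: `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"`, i.e. the task only runs when at least one package in `network_packages` is missing from the gathered package facts. A minimal, hypothetical reproduction of that pattern (not the role's actual tasks file, which lives at `roles/network/tasks/main.yml:73`) looks like:

```yaml
# Hypothetical standalone play reproducing the guard seen in the log.
# Assumes network_packages is defined (the role derives it from its defaults).
- name: Install packages only when something is missing
  hosts: managed_node3
  gather_facts: false
  tasks:
    - name: Gather the package facts the conditional relies on
      ansible.builtin.package_facts:
        manager: auto

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      # Skipped in the log because every listed package was already installed:
      # "false_condition": "not network_packages is subset(ansible_facts.packages.keys())"
      when: not network_packages is subset(ansible_facts.packages.keys())
```

`subset` here is Ansible's built-in list test; using it against `ansible_facts.packages.keys()` is what lets the role skip the package manager entirely on an already-provisioned host.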
30582 1726855331.55506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855331.55536: in run() - task 0affcc66-ac2b-aa83-7d57-000000001468 30582 1726855331.55563: variable 'ansible_search_path' from source: unknown 30582 1726855331.55572: variable 'ansible_search_path' from source: unknown 30582 1726855331.55617: calling self._execute() 30582 1726855331.55713: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.55725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.55738: variable 'omit' from source: magic vars 30582 1726855331.56128: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.56150: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.56275: variable 'network_state' from source: role '' defaults 30582 1726855331.56362: Evaluated conditional (network_state != {}): False 30582 1726855331.56365: when evaluation is False, skipping this task 30582 1726855331.56367: _execute() done 30582 1726855331.56369: dumping result to json 30582 1726855331.56371: done dumping result, returning 30582 1726855331.56373: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001468] 30582 1726855331.56376: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001468 30582 1726855331.56453: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001468 30582 1726855331.56457: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855331.56518: no more pending results, returning what we have 30582 1726855331.56523: results queue empty 30582 1726855331.56524: checking 
for any_errors_fatal 30582 1726855331.56534: done checking for any_errors_fatal 30582 1726855331.56535: checking for max_fail_percentage 30582 1726855331.56538: done checking for max_fail_percentage 30582 1726855331.56539: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.56539: done checking to see if all hosts have failed 30582 1726855331.56540: getting the remaining hosts for this loop 30582 1726855331.56542: done getting the remaining hosts for this loop 30582 1726855331.56547: getting the next task for host managed_node3 30582 1726855331.56557: done getting next task for host managed_node3 30582 1726855331.56562: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855331.56568: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855331.56599: getting variables 30582 1726855331.56602: in VariableManager get_vars() 30582 1726855331.56647: Calling all_inventory to load vars for managed_node3 30582 1726855331.56651: Calling groups_inventory to load vars for managed_node3 30582 1726855331.56654: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.56669: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.56672: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.56675: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.58297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.59841: done with get_vars() 30582 1726855331.59876: done getting variables 30582 1726855331.59942: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:11 -0400 (0:00:00.053) 0:01:07.949 ****** 30582 1726855331.59978: entering _queue_task() for managed_node3/package 30582 1726855331.60350: worker is 1 (out of 1 available) 30582 1726855331.60366: exiting _queue_task() for managed_node3/package 30582 1726855331.60380: done queuing things up, now waiting for results queue to drain 30582 1726855331.60382: waiting for pending results... 
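The `Install NetworkManager and nmstate when using network_state variable` task above is skipped for a simpler reason: `network_state` defaults to `{}` in the role, so the gate `network_state != {}` is false unless the caller actually passes a state description. A hedged sketch of that gate (the package list is inferred from the task name, not copied from the role):

```yaml
# Hypothetical sketch of the network_state gate; the real task is at
# roles/network/tasks/main.yml:85 in fedora.linux_system_roles.network.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  # False in the log: "false_condition": "network_state != {}"
  when: network_state != {}
```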
30582 1726855331.60808: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855331.60863: in run() - task 0affcc66-ac2b-aa83-7d57-000000001469 30582 1726855331.60886: variable 'ansible_search_path' from source: unknown 30582 1726855331.60900: variable 'ansible_search_path' from source: unknown 30582 1726855331.60992: calling self._execute() 30582 1726855331.61051: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.61064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.61082: variable 'omit' from source: magic vars 30582 1726855331.61868: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.62157: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.62195: variable 'network_state' from source: role '' defaults 30582 1726855331.62212: Evaluated conditional (network_state != {}): False 30582 1726855331.62220: when evaluation is False, skipping this task 30582 1726855331.62270: _execute() done 30582 1726855331.62279: dumping result to json 30582 1726855331.62289: done dumping result, returning 30582 1726855331.62302: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001469] 30582 1726855331.62313: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001469 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855331.62478: no more pending results, returning what we have 30582 1726855331.62483: results queue empty 30582 1726855331.62484: checking for any_errors_fatal 30582 1726855331.62494: done checking for any_errors_fatal 30582 1726855331.62495: checking for max_fail_percentage 30582 
1726855331.62498: done checking for max_fail_percentage 30582 1726855331.62499: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.62500: done checking to see if all hosts have failed 30582 1726855331.62500: getting the remaining hosts for this loop 30582 1726855331.62502: done getting the remaining hosts for this loop 30582 1726855331.62506: getting the next task for host managed_node3 30582 1726855331.62518: done getting next task for host managed_node3 30582 1726855331.62523: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855331.62530: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855331.62556: getting variables 30582 1726855331.62558: in VariableManager get_vars() 30582 1726855331.62934: Calling all_inventory to load vars for managed_node3 30582 1726855331.62938: Calling groups_inventory to load vars for managed_node3 30582 1726855331.62941: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.62955: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.62958: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.62962: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.63688: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001469 30582 1726855331.63692: WORKER PROCESS EXITING 30582 1726855331.65743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.69584: done with get_vars() 30582 1726855331.69619: done getting variables 30582 1726855331.69814: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:11 -0400 (0:00:00.098) 0:01:08.048 ****** 30582 1726855331.69853: entering _queue_task() for managed_node3/service 30582 1726855331.70482: worker is 1 (out of 1 available) 30582 1726855331.70499: exiting _queue_task() for managed_node3/service 30582 1726855331.70513: done queuing things up, now waiting for results queue to drain 30582 1726855331.70514: waiting for pending results... 
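After `Install python3-libnmstate` is skipped on the same `network_state != {}` gate, the run queues `Restart NetworkManager due to wireless or team interfaces`, this time through the `service` action plugin rather than `package`. The log shows the role loading `network_connections` and `__network_wireless_connections_defined` before deciding; the condition below is an assumption reconstructed from those variables, not the role's verbatim expression:

```yaml
# Hypothetical sketch; the real task is at roles/network/tasks/main.yml:109.
# Condition is assumed from the variables the log resolves for this task.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when:
    - network_provider == "nm"
    - __network_wireless_connections_defined or __network_team_connections_defined
```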
30582 1726855331.70834: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855331.71133: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146a 30582 1726855331.71148: variable 'ansible_search_path' from source: unknown 30582 1726855331.71152: variable 'ansible_search_path' from source: unknown 30582 1726855331.71191: calling self._execute() 30582 1726855331.71311: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.71395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.71398: variable 'omit' from source: magic vars 30582 1726855331.71762: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.71893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.71910: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855331.72124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.76195: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.76281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.76322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.76354: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.76396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.76469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855331.76512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.76534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.76578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.76601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.76793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.76796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.76799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.76801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.76803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.76805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.76833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.76856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.76897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.76911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.77115: variable 'network_connections' from source: include params 30582 1726855331.77292: variable 'interface' from source: play vars 30582 1726855331.77295: variable 'interface' from source: play vars 30582 1726855331.77297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.77475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.77528: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855331.77558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.77599: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.77641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.77662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.77700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.77725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855331.77776: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855331.78053: variable 'network_connections' from source: include params 30582 1726855331.78059: variable 'interface' from source: play vars 30582 1726855331.78133: variable 'interface' from source: play vars 30582 1726855331.78158: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855331.78162: when evaluation is False, skipping this task 30582 1726855331.78165: _execute() done 30582 1726855331.78167: dumping result to json 30582 1726855331.78169: done dumping result, returning 30582 1726855331.78181: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000146a] 30582 1726855331.78188: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146a 30582 1726855331.78285: done sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000146a skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855331.78510: no more pending results, returning what we have 30582 1726855331.78514: results queue empty 30582 1726855331.78516: checking for any_errors_fatal 30582 1726855331.78522: done checking for any_errors_fatal 30582 1726855331.78523: checking for max_fail_percentage 30582 1726855331.78525: done checking for max_fail_percentage 30582 1726855331.78526: checking to see if all hosts have failed and the running result is not ok 30582 1726855331.78527: done checking to see if all hosts have failed 30582 1726855331.78528: getting the remaining hosts for this loop 30582 1726855331.78529: done getting the remaining hosts for this loop 30582 1726855331.78533: getting the next task for host managed_node3 30582 1726855331.78541: done getting next task for host managed_node3 30582 1726855331.78545: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855331.78495: WORKER PROCESS EXITING 30582 1726855331.78741: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855331.79009: getting variables 30582 1726855331.79011: in VariableManager get_vars() 30582 1726855331.79047: Calling all_inventory to load vars for managed_node3 30582 1726855331.79049: Calling groups_inventory to load vars for managed_node3 30582 1726855331.79051: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855331.79061: Calling all_plugins_play to load vars for managed_node3 30582 1726855331.79064: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855331.79066: Calling groups_plugins_play to load vars for managed_node3 30582 1726855331.81857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855331.85460: done with get_vars() 30582 1726855331.85638: done getting variables 30582 1726855331.85706: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:11 -0400 (0:00:00.159) 0:01:08.207 ****** 30582 1726855331.85803: entering _queue_task() for managed_node3/service 30582 1726855331.86656: worker is 1 (out of 1 available) 30582 1726855331.86674: exiting _queue_task() for managed_node3/service 30582 1726855331.86723: done 
queuing things up, now waiting for results queue to drain 30582 1726855331.86726: waiting for pending results... 30582 1726855331.87243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855331.87573: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146b 30582 1726855331.87616: variable 'ansible_search_path' from source: unknown 30582 1726855331.87625: variable 'ansible_search_path' from source: unknown 30582 1726855331.87684: calling self._execute() 30582 1726855331.87975: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855331.87990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855331.88007: variable 'omit' from source: magic vars 30582 1726855331.89294: variable 'ansible_distribution_major_version' from source: facts 30582 1726855331.89297: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855331.89426: variable 'network_provider' from source: set_fact 30582 1726855331.89892: variable 'network_state' from source: role '' defaults 30582 1726855331.89895: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855331.89897: variable 'omit' from source: magic vars 30582 1726855331.89899: variable 'omit' from source: magic vars 30582 1726855331.89967: variable 'network_service_name' from source: role '' defaults 30582 1726855331.90392: variable 'network_service_name' from source: role '' defaults 30582 1726855331.90395: variable '__network_provider_setup' from source: role '' defaults 30582 1726855331.90397: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855331.90441: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855331.90892: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855331.90896: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855331.90993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855331.96132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855331.96373: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855331.96417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855331.96456: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855331.96623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855331.96708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.96743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.96820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.96939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.96959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.97102: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.97201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.97228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.97275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.97326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.97588: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855331.97742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.97766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.97795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.97842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.97860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.97972: variable 'ansible_python' from source: facts 30582 1726855331.97993: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855331.98092: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855331.98186: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855331.98325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.98346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.98383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.98426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.98440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.98503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855331.98526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855331.98608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.98612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855331.98812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855331.99092: variable 'network_connections' from source: include params 30582 1726855331.99115: variable 'interface' from source: play vars 30582 1726855331.99330: variable 'interface' from source: play vars 30582 1726855331.99437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855331.99641: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855331.99698: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855331.99746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855331.99798: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855331.99892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855331.99895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855331.99929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855331.99964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855332.00020: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855332.00281: variable 'network_connections' from source: include params 30582 1726855332.00393: variable 'interface' from source: play vars 30582 1726855332.00396: variable 'interface' from source: play vars 30582 1726855332.00407: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855332.00493: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855332.00791: variable 'network_connections' from source: include params 30582 1726855332.00805: variable 'interface' from source: play vars 30582 1726855332.00881: variable 'interface' from source: play vars 30582 1726855332.00913: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855332.01000: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855332.01263: variable 'network_connections' from source: include params 30582 1726855332.01273: variable 'interface' from source: play vars 30582 1726855332.01343: variable 'interface' from source: play vars 30582 1726855332.01448: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855332.01892: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855332.01896: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855332.01898: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855332.02124: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855332.02854: variable 'network_connections' from source: include params 30582 1726855332.02864: variable 'interface' from source: play vars 30582 1726855332.02941: variable 'interface' from source: play vars 30582 1726855332.02954: variable 'ansible_distribution' from source: facts 30582 1726855332.02963: variable '__network_rh_distros' from source: role '' defaults 30582 1726855332.02973: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.02994: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855332.03162: variable 'ansible_distribution' from source: facts 30582 1726855332.03170: variable '__network_rh_distros' from source: role '' defaults 30582 1726855332.03180: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.03202: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855332.03375: variable 'ansible_distribution' from source: facts 30582 1726855332.03385: variable '__network_rh_distros' from source: role '' defaults 30582 1726855332.03398: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.03437: variable 'network_provider' from source: set_fact 30582 1726855332.03464: variable 'omit' from source: magic vars 30582 1726855332.03498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855332.03531: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855332.03555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855332.03577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855332.03596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855332.03630: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855332.03638: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.03647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.03753: Set connection var ansible_timeout to 10 30582 1726855332.03762: Set connection var ansible_connection to ssh 30582 1726855332.03775: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855332.03785: Set connection var ansible_pipelining to False 30582 1726855332.03798: Set connection var ansible_shell_executable to /bin/sh 30582 1726855332.03805: Set connection var ansible_shell_type to sh 30582 1726855332.03833: variable 'ansible_shell_executable' from source: unknown 30582 1726855332.03844: variable 'ansible_connection' from source: unknown 30582 1726855332.03853: variable 'ansible_module_compression' from source: unknown 30582 1726855332.03861: variable 'ansible_shell_type' from source: unknown 30582 1726855332.03869: variable 'ansible_shell_executable' from source: unknown 30582 1726855332.03876: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.03992: variable 'ansible_pipelining' from source: unknown 30582 1726855332.03995: variable 'ansible_timeout' from source: unknown 30582 1726855332.03997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855332.04003: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855332.04021: variable 'omit' from source: magic vars 30582 1726855332.04029: starting attempt loop 30582 1726855332.04034: running the handler 30582 1726855332.04107: variable 'ansible_facts' from source: unknown 30582 1726855332.04899: _low_level_execute_command(): starting 30582 1726855332.04912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855332.05724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855332.05807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.05842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.05863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.05890: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855332.06124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.07715: stdout chunk (state=3): >>>/root <<< 30582 1726855332.07871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855332.07874: stdout chunk (state=3): >>><<< 30582 1726855332.07877: stderr chunk (state=3): >>><<< 30582 1726855332.07995: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855332.07998: _low_level_execute_command(): starting 30582 1726855332.08002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604 `" && echo 
ansible-tmp-1726855332.0790577-33781-99148697534604="` echo /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604 `" ) && sleep 0' 30582 1726855332.08574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855332.08591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855332.08605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855332.08642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.08659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855332.08703: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.08766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.08793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.08814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855332.08901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.10856: stdout chunk (state=3): >>>ansible-tmp-1726855332.0790577-33781-99148697534604=/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604 <<< 
30582 1726855332.11294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855332.11297: stdout chunk (state=3): >>><<< 30582 1726855332.11300: stderr chunk (state=3): >>><<< 30582 1726855332.11302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855332.0790577-33781-99148697534604=/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855332.11305: variable 'ansible_module_compression' from source: unknown 30582 1726855332.11364: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855332.11426: variable 'ansible_facts' from source: unknown 30582 1726855332.11658: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py 30582 1726855332.11910: Sending initial data 30582 1726855332.11913: Sent initial data (155 bytes) 30582 1726855332.12502: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855332.12693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855332.12696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855332.12703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855332.12706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855332.12708: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855332.12710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.12712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855332.12714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855332.12716: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855332.12718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855332.12720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855332.12722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.12724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.12726: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.12729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855332.12811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.14414: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855332.14498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855332.14561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3alpigas /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py <<< 30582 1726855332.14565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py" <<< 30582 1726855332.14621: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3alpigas" to remote "/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py" <<< 30582 1726855332.16862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855332.16866: stdout chunk (state=3): >>><<< 30582 1726855332.16868: stderr chunk (state=3): >>><<< 30582 1726855332.16870: done transferring module to remote 30582 1726855332.16872: _low_level_execute_command(): starting 30582 1726855332.16874: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/ /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py && sleep 0' 30582 1726855332.17501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855332.17602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.17638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.17653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.17673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855332.17764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.19613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855332.19625: stdout chunk (state=3): >>><<< 30582 1726855332.19637: stderr chunk (state=3): >>><<< 30582 1726855332.19657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855332.19666: _low_level_execute_command(): starting 30582 1726855332.19677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/AnsiballZ_systemd.py && sleep 0' 30582 1726855332.20291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855332.20307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855332.20323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855332.20341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855332.20438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.20510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 30582 1726855332.20599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.49699: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10661888", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317489664", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2153699000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855332.51645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855332.51649: stdout chunk (state=3): >>><<< 30582 1726855332.51652: stderr chunk (state=3): >>><<< 30582 1726855332.51914: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10661888", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317489664", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2153699000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855332.52089: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855332.52219: _low_level_execute_command(): starting 30582 1726855332.52236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855332.0790577-33781-99148697534604/ > /dev/null 2>&1 && sleep 0' 30582 1726855332.53288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.53351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.53374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.53392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855332.53477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.55321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855332.55351: stderr chunk (state=3): >>><<< 30582 1726855332.55353: stdout chunk (state=3): >>><<< 30582 1726855332.55371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855332.55375: handler run complete 30582 1726855332.55423: attempt loop complete, returning result 30582 1726855332.55426: _execute() done 30582 1726855332.55428: dumping result to json 30582 1726855332.55500: done dumping result, returning 30582 1726855332.55504: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-00000000146b] 30582 1726855332.55506: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146b 30582 1726855332.56439: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146b 30582 1726855332.56442: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855332.56497: no more pending results, returning what we have 30582 1726855332.56500: results queue empty 30582 1726855332.56501: checking for any_errors_fatal 30582 1726855332.56505: done checking for any_errors_fatal 30582 1726855332.56506: checking for max_fail_percentage 30582 1726855332.56507: done checking for max_fail_percentage 30582 1726855332.56508: checking to see if all hosts have failed and the running result is not ok 30582 1726855332.56509: done checking to see if all hosts have failed 30582 1726855332.56510: getting the remaining hosts for this loop 30582 1726855332.56511: done getting the remaining hosts for this loop 30582 1726855332.56514: getting the next task for host managed_node3 30582 1726855332.56521: done getting next task for host managed_node3 30582 1726855332.56524: ^ task is: TASK: 
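For readers following the trace: the `_execute_module` call above corresponds to a role task roughly like the following. This is a hedged reconstruction from the module arguments visible in the log (`name`, `state`, `enabled`, `no_log`), not the literal task source, which lives in the collection's `roles/network/tasks/main.yml` and may differ.

```yaml
# Hypothetical sketch of the task the log shows completing "ok".
# Argument values are taken from the _execute_module dump above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # explains the "censored" field in the task result
```

Because `no_log: true` is set, the result body is replaced with the "output has been hidden" placeholder seen above, while `changed: false` is still reported.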
fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855332.56531: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855332.56543: getting variables 30582 1726855332.56545: in VariableManager get_vars() 30582 1726855332.56575: Calling all_inventory to load vars for managed_node3 30582 1726855332.56578: Calling groups_inventory to load vars for managed_node3 30582 1726855332.56581: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855332.56593: Calling all_plugins_play to load vars for managed_node3 30582 1726855332.56597: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855332.56600: Calling groups_plugins_play to load vars for managed_node3 30582 1726855332.57907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855332.58812: done with get_vars() 30582 1726855332.58833: done getting variables 30582 1726855332.58881: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:12 -0400 (0:00:00.731) 0:01:08.939 ****** 30582 1726855332.58914: entering _queue_task() for managed_node3/service 30582 1726855332.59184: worker is 1 (out of 1 available) 30582 1726855332.59200: exiting _queue_task() for managed_node3/service 30582 1726855332.59213: done queuing things up, now waiting for results queue to drain 30582 1726855332.59215: waiting for pending results... 
30582 1726855332.59421: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855332.59528: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146c 30582 1726855332.59581: variable 'ansible_search_path' from source: unknown 30582 1726855332.59586: variable 'ansible_search_path' from source: unknown 30582 1726855332.59609: calling self._execute() 30582 1726855332.59810: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.59814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.59817: variable 'omit' from source: magic vars 30582 1726855332.60299: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.60303: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855332.60305: variable 'network_provider' from source: set_fact 30582 1726855332.60307: Evaluated conditional (network_provider == "nm"): True 30582 1726855332.60332: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855332.60426: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855332.60606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855332.62878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855332.62931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855332.62958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855332.62984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855332.63006: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855332.63210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855332.63233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855332.63254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855332.63309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855332.63313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855332.63349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855332.63375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855332.63417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855332.63434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855332.63445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855332.63489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855332.63516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855332.63540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855332.63575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855332.63619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855332.63801: variable 'network_connections' from source: include params 30582 1726855332.63805: variable 'interface' from source: play vars 30582 1726855332.63854: variable 'interface' from source: play vars 30582 1726855332.63924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855332.64091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855332.64140: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855332.64182: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855332.64203: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855332.64257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855332.64273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855332.64299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855332.64328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855332.64374: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855332.64614: variable 'network_connections' from source: include params 30582 1726855332.64618: variable 'interface' from source: play vars 30582 1726855332.64717: variable 'interface' from source: play vars 30582 1726855332.64720: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855332.64723: when evaluation is False, skipping this task 30582 1726855332.64725: _execute() done 30582 1726855332.64873: dumping result to json 30582 1726855332.64877: done dumping result, returning 30582 1726855332.64879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-00000000146c] 30582 
1726855332.64892: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146c 30582 1726855332.64958: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146c 30582 1726855332.64961: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855332.65007: no more pending results, returning what we have 30582 1726855332.65011: results queue empty 30582 1726855332.65012: checking for any_errors_fatal 30582 1726855332.65028: done checking for any_errors_fatal 30582 1726855332.65029: checking for max_fail_percentage 30582 1726855332.65031: done checking for max_fail_percentage 30582 1726855332.65032: checking to see if all hosts have failed and the running result is not ok 30582 1726855332.65032: done checking to see if all hosts have failed 30582 1726855332.65033: getting the remaining hosts for this loop 30582 1726855332.65034: done getting the remaining hosts for this loop 30582 1726855332.65037: getting the next task for host managed_node3 30582 1726855332.65044: done getting next task for host managed_node3 30582 1726855332.65047: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855332.65052: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
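The skip above is driven by a `when:` conditional. A hedged sketch of the shape of that task, inferred from the `false_condition` field in the result and the variables the log resolves (`__network_wpa_supplicant_required`, `network_provider`); the real task body in the role may differ:

```yaml
# Hypothetical sketch: the conditional evaluated False, so the
# service task was skipped without contacting the host.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```

When every `when:` clause is not satisfied, Ansible short-circuits in `_execute()` — which is why the log shows "when evaluation is False, skipping this task" with no `_low_level_execute_command` calls following it.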
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855332.65074: getting variables 30582 1726855332.65076: in VariableManager get_vars() 30582 1726855332.65111: Calling all_inventory to load vars for managed_node3 30582 1726855332.65114: Calling groups_inventory to load vars for managed_node3 30582 1726855332.65115: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855332.65123: Calling all_plugins_play to load vars for managed_node3 30582 1726855332.65125: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855332.65128: Calling groups_plugins_play to load vars for managed_node3 30582 1726855332.67200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855332.68145: done with get_vars() 30582 1726855332.68163: done getting variables 30582 1726855332.68217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:12 -0400 (0:00:00.093) 0:01:09.032 
****** 30582 1726855332.68246: entering _queue_task() for managed_node3/service 30582 1726855332.68718: worker is 1 (out of 1 available) 30582 1726855332.68734: exiting _queue_task() for managed_node3/service 30582 1726855332.68760: done queuing things up, now waiting for results queue to drain 30582 1726855332.68762: waiting for pending results... 30582 1726855332.69626: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855332.69632: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146d 30582 1726855332.69637: variable 'ansible_search_path' from source: unknown 30582 1726855332.69721: variable 'ansible_search_path' from source: unknown 30582 1726855332.69726: calling self._execute() 30582 1726855332.69820: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.69843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.69859: variable 'omit' from source: magic vars 30582 1726855332.70290: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.70308: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855332.70432: variable 'network_provider' from source: set_fact 30582 1726855332.70443: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855332.70450: when evaluation is False, skipping this task 30582 1726855332.70457: _execute() done 30582 1726855332.70494: dumping result to json 30582 1726855332.70497: done dumping result, returning 30582 1726855332.70500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-00000000146d] 30582 1726855332.70503: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146d skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30582 1726855332.70743: no more pending results, returning what we have 30582 1726855332.70747: results queue empty 30582 1726855332.70748: checking for any_errors_fatal 30582 1726855332.70757: done checking for any_errors_fatal 30582 1726855332.70758: checking for max_fail_percentage 30582 1726855332.70760: done checking for max_fail_percentage 30582 1726855332.70761: checking to see if all hosts have failed and the running result is not ok 30582 1726855332.70762: done checking to see if all hosts have failed 30582 1726855332.70763: getting the remaining hosts for this loop 30582 1726855332.70764: done getting the remaining hosts for this loop 30582 1726855332.70768: getting the next task for host managed_node3 30582 1726855332.70778: done getting next task for host managed_node3 30582 1726855332.70782: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855332.70790: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855332.70861: getting variables 30582 1726855332.70863: in VariableManager get_vars() 30582 1726855332.70908: Calling all_inventory to load vars for managed_node3 30582 1726855332.70911: Calling groups_inventory to load vars for managed_node3 30582 1726855332.70914: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855332.70926: Calling all_plugins_play to load vars for managed_node3 30582 1726855332.70928: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855332.70931: Calling groups_plugins_play to load vars for managed_node3 30582 1726855332.71528: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146d 30582 1726855332.71532: WORKER PROCESS EXITING 30582 1726855332.72383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855332.75331: done with get_vars() 30582 1726855332.75366: done getting variables 30582 1726855332.75438: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:12 -0400 (0:00:00.072) 0:01:09.104 ****** 30582 1726855332.75482: entering _queue_task() for managed_node3/copy 30582 1726855332.76265: worker is 1 (out of 1 available) 30582 1726855332.76283: exiting _queue_task() for managed_node3/copy 30582 1726855332.76399: done queuing things up, now waiting for results queue to drain 30582 1726855332.76401: waiting for pending 
results... 30582 1726855332.76892: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855332.77295: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146e 30582 1726855332.77300: variable 'ansible_search_path' from source: unknown 30582 1726855332.77303: variable 'ansible_search_path' from source: unknown 30582 1726855332.77305: calling self._execute() 30582 1726855332.77398: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.77402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.77412: variable 'omit' from source: magic vars 30582 1726855332.77827: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.77843: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855332.77976: variable 'network_provider' from source: set_fact 30582 1726855332.77990: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855332.77998: when evaluation is False, skipping this task 30582 1726855332.78006: _execute() done 30582 1726855332.78014: dumping result to json 30582 1726855332.78020: done dumping result, returning 30582 1726855332.78033: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-00000000146e] 30582 1726855332.78044: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146e skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855332.78227: no more pending results, returning what we have 30582 1726855332.78231: results queue empty 30582 1726855332.78232: checking for any_errors_fatal 30582 1726855332.78237: done checking for any_errors_fatal 30582 1726855332.78238: checking for max_fail_percentage 
30582 1726855332.78240: done checking for max_fail_percentage 30582 1726855332.78241: checking to see if all hosts have failed and the running result is not ok 30582 1726855332.78242: done checking to see if all hosts have failed 30582 1726855332.78243: getting the remaining hosts for this loop 30582 1726855332.78245: done getting the remaining hosts for this loop 30582 1726855332.78249: getting the next task for host managed_node3 30582 1726855332.78259: done getting next task for host managed_node3 30582 1726855332.78263: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855332.78271: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855332.78301: getting variables 30582 1726855332.78303: in VariableManager get_vars() 30582 1726855332.78345: Calling all_inventory to load vars for managed_node3 30582 1726855332.78348: Calling groups_inventory to load vars for managed_node3 30582 1726855332.78351: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855332.78363: Calling all_plugins_play to load vars for managed_node3 30582 1726855332.78367: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855332.78373: Calling groups_plugins_play to load vars for managed_node3 30582 1726855332.79091: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146e 30582 1726855332.79095: WORKER PROCESS EXITING 30582 1726855332.80294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855332.81884: done with get_vars() 30582 1726855332.81917: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:12 -0400 (0:00:00.065) 0:01:09.170 ****** 30582 1726855332.82012: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855332.82400: worker is 1 (out of 1 available) 30582 1726855332.82415: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855332.82429: done queuing things up, now waiting for results queue to drain 30582 1726855332.82430: waiting for pending results... 
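Both "Enable network service" and "Ensure initscripts network file dependency is present" are skipped for the same reason: they are gated on the provider fact set earlier in the run. A hedged sketch of that gating, assuming the conventional shape of such tasks (the actual task files at `tasks/main.yml:142` and `:150` may differ):

```yaml
# Hypothetical sketch: tasks only relevant to the legacy
# initscripts provider; network_provider is "nm" here, so both skip.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
```

This matches the log's `"false_condition": "network_provider == \"initscripts\""` in the second skip result.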
30582 1726855332.82710: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855332.82846: in run() - task 0affcc66-ac2b-aa83-7d57-00000000146f 30582 1726855332.82871: variable 'ansible_search_path' from source: unknown 30582 1726855332.82880: variable 'ansible_search_path' from source: unknown 30582 1726855332.82929: calling self._execute() 30582 1726855332.83094: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.83097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.83100: variable 'omit' from source: magic vars 30582 1726855332.83476: variable 'ansible_distribution_major_version' from source: facts 30582 1726855332.83495: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855332.83506: variable 'omit' from source: magic vars 30582 1726855332.83574: variable 'omit' from source: magic vars 30582 1726855332.83741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855332.85989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855332.86075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855332.86180: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855332.86183: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855332.86186: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855332.86271: variable 'network_provider' from source: set_fact 30582 1726855332.86420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855332.86453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855332.86488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855332.86538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855332.86558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855332.86645: variable 'omit' from source: magic vars 30582 1726855332.86771: variable 'omit' from source: magic vars 30582 1726855332.86884: variable 'network_connections' from source: include params 30582 1726855332.86937: variable 'interface' from source: play vars 30582 1726855332.86976: variable 'interface' from source: play vars 30582 1726855332.87130: variable 'omit' from source: magic vars 30582 1726855332.87143: variable '__lsr_ansible_managed' from source: task vars 30582 1726855332.87213: variable '__lsr_ansible_managed' from source: task vars 30582 1726855332.87428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855332.87699: Loaded config def from plugin (lookup/template) 30582 1726855332.87703: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855332.87705: File lookup term: get_ansible_managed.j2 30582 1726855332.87710: variable 
'ansible_search_path' from source: unknown 30582 1726855332.87720: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855332.87737: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855332.87760: variable 'ansible_search_path' from source: unknown 30582 1726855332.95448: variable 'ansible_managed' from source: unknown 30582 1726855332.95642: variable 'omit' from source: magic vars 30582 1726855332.95659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855332.95695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855332.95748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855332.95752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855332.95760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855332.95799: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855332.95808: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.95817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.95966: Set connection var ansible_timeout to 10 30582 1726855332.95972: Set connection var ansible_connection to ssh 30582 1726855332.95975: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855332.95977: Set connection var ansible_pipelining to False 30582 1726855332.95979: Set connection var ansible_shell_executable to /bin/sh 30582 1726855332.95981: Set connection var ansible_shell_type to sh 30582 1726855332.96006: variable 'ansible_shell_executable' from source: unknown 30582 1726855332.96014: variable 'ansible_connection' from source: unknown 30582 1726855332.96078: variable 'ansible_module_compression' from source: unknown 30582 1726855332.96081: variable 'ansible_shell_type' from source: unknown 30582 1726855332.96083: variable 'ansible_shell_executable' from source: unknown 30582 1726855332.96085: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855332.96091: variable 'ansible_pipelining' from source: unknown 30582 1726855332.96093: variable 'ansible_timeout' from source: unknown 30582 1726855332.96095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855332.96211: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855332.96237: variable 'omit' from 
source: magic vars 30582 1726855332.96249: starting attempt loop 30582 1726855332.96258: running the handler 30582 1726855332.96297: _low_level_execute_command(): starting 30582 1726855332.96300: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855332.97041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855332.97108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855332.97113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855332.97170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855332.97184: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855332.97228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855332.97260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855332.97286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855332.97498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855332.99141: stdout chunk (state=3): >>>/root <<< 30582 1726855332.99291: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30582 1726855332.99304: stdout chunk (state=3): >>><<< 30582 1726855332.99317: stderr chunk (state=3): >>><<< 30582 1726855332.99360: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855332.99570: _low_level_execute_command(): starting 30582 1726855332.99574: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236 `" && echo ansible-tmp-1726855332.9947717-33830-169333225979236="` echo /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236 `" ) && sleep 0' 30582 1726855333.00811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855333.00821: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.00832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.00848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855333.00873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855333.00879: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855333.00882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.01091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.01351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.01354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.03281: stdout chunk (state=3): >>>ansible-tmp-1726855332.9947717-33830-169333225979236=/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236 <<< 30582 1726855333.03427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.03525: stderr chunk (state=3): >>><<< 30582 1726855333.03618: stdout chunk (state=3): >>><<< 30582 1726855333.03640: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855332.9947717-33830-169333225979236=/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.03689: variable 'ansible_module_compression' from source: unknown 30582 1726855333.03995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855333.03999: variable 'ansible_facts' from source: unknown 30582 1726855333.04221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py 30582 1726855333.04648: Sending initial data 30582 1726855333.04651: Sent initial data (168 bytes) 30582 1726855333.05564: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855333.05574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.05591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.05607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855333.05619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855333.05627: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855333.05635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.05649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855333.05657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855333.05663: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855333.05674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.05680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.05760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.05778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.05872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.07552: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855333.07608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855333.07684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp7wy3wdyh /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py <<< 30582 1726855333.07699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py" <<< 30582 1726855333.07770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp7wy3wdyh" to remote "/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py" <<< 30582 1726855333.09098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.09147: stderr chunk (state=3): >>><<< 30582 1726855333.09154: stdout chunk (state=3): >>><<< 30582 1726855333.09210: done transferring module to remote 30582 1726855333.09224: _low_level_execute_command(): starting 
30582 1726855333.09227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/ /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py && sleep 0' 30582 1726855333.10015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.10019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.10086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.11908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.11912: stdout chunk (state=3): >>><<< 30582 1726855333.11914: stderr chunk (state=3): >>><<< 30582 1726855333.11932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.11941: _low_level_execute_command(): starting 30582 1726855333.11950: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/AnsiballZ_network_connections.py && sleep 0' 30582 1726855333.12627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855333.12642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.12658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.12677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855333.12734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855333.12804: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.12862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.12891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.12906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.13011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.40421: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ac77cybr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ac77cybr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2e08db44-6b45-462b-a24b-1e1d0b41e5c0: error=unknown <<< 30582 1726855333.40546: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855333.42366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855333.42403: stderr chunk (state=3): >>><<< 30582 1726855333.42407: stdout chunk (state=3): >>><<< 30582 1726855333.42423: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ac77cybr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ac77cybr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2e08db44-6b45-462b-a24b-1e1d0b41e5c0: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": 
"absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855333.42450: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855333.42458: _low_level_execute_command(): starting 30582 1726855333.42463: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855332.9947717-33830-169333225979236/ > /dev/null 2>&1 && sleep 0' 30582 1726855333.42925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.42929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855333.42931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.42933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.42937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.42984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.42991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.43002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.43066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.44907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.44933: stderr chunk (state=3): >>><<< 30582 1726855333.44936: stdout chunk (state=3): >>><<< 30582 1726855333.44949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.44958: handler run complete 30582 1726855333.44979: attempt loop complete, returning result 30582 1726855333.44984: _execute() done 30582 1726855333.44986: dumping result to json 30582 1726855333.44991: done dumping result, returning 30582 1726855333.45001: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-00000000146f] 30582 1726855333.45003: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146f 30582 1726855333.45106: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000146f 30582 1726855333.45110: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30582 1726855333.45202: no more pending results, returning what we have 30582 1726855333.45206: results queue empty 30582 1726855333.45207: checking for any_errors_fatal 30582 1726855333.45218: done checking for any_errors_fatal 30582 1726855333.45219: checking for max_fail_percentage 30582 1726855333.45220: done checking for max_fail_percentage 30582 1726855333.45221: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.45222: done checking to see if all hosts have failed 30582 1726855333.45223: getting the remaining hosts for this loop 30582 1726855333.45224: done getting the remaining hosts for this loop 30582 1726855333.45227: getting the next 
task for host managed_node3 30582 1726855333.45234: done getting next task for host managed_node3 30582 1726855333.45237: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855333.45242: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855333.45253: getting variables 30582 1726855333.45255: in VariableManager get_vars() 30582 1726855333.45296: Calling all_inventory to load vars for managed_node3 30582 1726855333.45298: Calling groups_inventory to load vars for managed_node3 30582 1726855333.45300: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.45310: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.45313: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.45315: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.46175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.47179: done with get_vars() 30582 1726855333.47198: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:13 -0400 (0:00:00.652) 0:01:09.822 ****** 30582 1726855333.47262: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855333.47531: worker is 1 (out of 1 available) 30582 1726855333.47547: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855333.47560: done queuing things up, now waiting for results queue to drain 30582 1726855333.47562: waiting for pending results... 
30582 1726855333.47751: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855333.47854: in run() - task 0affcc66-ac2b-aa83-7d57-000000001470 30582 1726855333.47868: variable 'ansible_search_path' from source: unknown 30582 1726855333.47872: variable 'ansible_search_path' from source: unknown 30582 1726855333.47905: calling self._execute() 30582 1726855333.47970: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.47977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.47985: variable 'omit' from source: magic vars 30582 1726855333.48272: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.48283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.48372: variable 'network_state' from source: role '' defaults 30582 1726855333.48385: Evaluated conditional (network_state != {}): False 30582 1726855333.48390: when evaluation is False, skipping this task 30582 1726855333.48392: _execute() done 30582 1726855333.48395: dumping result to json 30582 1726855333.48397: done dumping result, returning 30582 1726855333.48405: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000001470] 30582 1726855333.48410: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001470 30582 1726855333.48498: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001470 30582 1726855333.48501: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855333.48547: no more pending results, returning what we have 30582 1726855333.48551: results queue empty 30582 1726855333.48552: checking for any_errors_fatal 30582 1726855333.48562: done checking for any_errors_fatal 
30582 1726855333.48563: checking for max_fail_percentage 30582 1726855333.48565: done checking for max_fail_percentage 30582 1726855333.48566: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.48567: done checking to see if all hosts have failed 30582 1726855333.48567: getting the remaining hosts for this loop 30582 1726855333.48569: done getting the remaining hosts for this loop 30582 1726855333.48573: getting the next task for host managed_node3 30582 1726855333.48581: done getting next task for host managed_node3 30582 1726855333.48584: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855333.48591: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855333.48616: getting variables 30582 1726855333.48617: in VariableManager get_vars() 30582 1726855333.48654: Calling all_inventory to load vars for managed_node3 30582 1726855333.48658: Calling groups_inventory to load vars for managed_node3 30582 1726855333.48660: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.48670: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.48673: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.48676: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.49467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.50351: done with get_vars() 30582 1726855333.50369: done getting variables 30582 1726855333.50417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:13 -0400 (0:00:00.031) 0:01:09.854 ****** 30582 1726855333.50445: entering _queue_task() for managed_node3/debug 30582 1726855333.50702: worker is 1 (out of 1 available) 30582 1726855333.50718: exiting _queue_task() for managed_node3/debug 30582 1726855333.50729: done queuing things up, now waiting for results queue to drain 30582 1726855333.50731: waiting for pending results... 
30582 1726855333.50927: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855333.51013: in run() - task 0affcc66-ac2b-aa83-7d57-000000001471 30582 1726855333.51027: variable 'ansible_search_path' from source: unknown 30582 1726855333.51030: variable 'ansible_search_path' from source: unknown 30582 1726855333.51061: calling self._execute() 30582 1726855333.51139: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.51143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.51151: variable 'omit' from source: magic vars 30582 1726855333.51442: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.51451: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.51458: variable 'omit' from source: magic vars 30582 1726855333.51513: variable 'omit' from source: magic vars 30582 1726855333.51533: variable 'omit' from source: magic vars 30582 1726855333.51565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855333.51597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855333.51615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855333.51630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.51640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.51664: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855333.51667: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.51669: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855333.51749: Set connection var ansible_timeout to 10 30582 1726855333.51753: Set connection var ansible_connection to ssh 30582 1726855333.51758: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855333.51762: Set connection var ansible_pipelining to False 30582 1726855333.51767: Set connection var ansible_shell_executable to /bin/sh 30582 1726855333.51772: Set connection var ansible_shell_type to sh 30582 1726855333.51792: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.51795: variable 'ansible_connection' from source: unknown 30582 1726855333.51797: variable 'ansible_module_compression' from source: unknown 30582 1726855333.51800: variable 'ansible_shell_type' from source: unknown 30582 1726855333.51802: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.51804: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.51808: variable 'ansible_pipelining' from source: unknown 30582 1726855333.51810: variable 'ansible_timeout' from source: unknown 30582 1726855333.51814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.51922: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855333.51934: variable 'omit' from source: magic vars 30582 1726855333.51937: starting attempt loop 30582 1726855333.51939: running the handler 30582 1726855333.52041: variable '__network_connections_result' from source: set_fact 30582 1726855333.52084: handler run complete 30582 1726855333.52099: attempt loop complete, returning result 30582 1726855333.52102: _execute() done 30582 1726855333.52104: dumping result to json 30582 1726855333.52107: 
done dumping result, returning 30582 1726855333.52115: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001471] 30582 1726855333.52120: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001471 30582 1726855333.52210: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001471 30582 1726855333.52212: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 30582 1726855333.52279: no more pending results, returning what we have 30582 1726855333.52283: results queue empty 30582 1726855333.52284: checking for any_errors_fatal 30582 1726855333.52293: done checking for any_errors_fatal 30582 1726855333.52293: checking for max_fail_percentage 30582 1726855333.52295: done checking for max_fail_percentage 30582 1726855333.52296: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.52297: done checking to see if all hosts have failed 30582 1726855333.52297: getting the remaining hosts for this loop 30582 1726855333.52299: done getting the remaining hosts for this loop 30582 1726855333.52303: getting the next task for host managed_node3 30582 1726855333.52314: done getting next task for host managed_node3 30582 1726855333.52318: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855333.52323: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855333.52335: getting variables 30582 1726855333.52336: in VariableManager get_vars() 30582 1726855333.52374: Calling all_inventory to load vars for managed_node3 30582 1726855333.52377: Calling groups_inventory to load vars for managed_node3 30582 1726855333.52379: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.52395: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.52399: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.52402: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.53395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.54269: done with get_vars() 30582 1726855333.54294: done getting variables 30582 1726855333.54339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:13 -0400 (0:00:00.039) 0:01:09.893 ****** 30582 1726855333.54373: entering _queue_task() for managed_node3/debug 30582 1726855333.54637: worker is 1 (out of 1 available) 30582 1726855333.54652: exiting _queue_task() for managed_node3/debug 30582 1726855333.54664: done queuing things up, now waiting for results queue to drain 30582 1726855333.54666: waiting for pending results... 30582 1726855333.54862: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855333.54961: in run() - task 0affcc66-ac2b-aa83-7d57-000000001472 30582 1726855333.54975: variable 'ansible_search_path' from source: unknown 30582 1726855333.54978: variable 'ansible_search_path' from source: unknown 30582 1726855333.55013: calling self._execute() 30582 1726855333.55089: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.55093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.55102: variable 'omit' from source: magic vars 30582 1726855333.55384: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.55395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.55402: variable 'omit' from source: magic vars 30582 1726855333.55445: variable 'omit' from source: magic vars 30582 1726855333.55470: variable 'omit' from source: magic vars 30582 1726855333.55506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855333.55533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855333.55552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855333.55565: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.55578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.55604: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855333.55607: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.55610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.55686: Set connection var ansible_timeout to 10 30582 1726855333.55691: Set connection var ansible_connection to ssh 30582 1726855333.55695: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855333.55700: Set connection var ansible_pipelining to False 30582 1726855333.55706: Set connection var ansible_shell_executable to /bin/sh 30582 1726855333.55708: Set connection var ansible_shell_type to sh 30582 1726855333.55725: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.55728: variable 'ansible_connection' from source: unknown 30582 1726855333.55731: variable 'ansible_module_compression' from source: unknown 30582 1726855333.55733: variable 'ansible_shell_type' from source: unknown 30582 1726855333.55735: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.55737: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.55739: variable 'ansible_pipelining' from source: unknown 30582 1726855333.55743: variable 'ansible_timeout' from source: unknown 30582 1726855333.55747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.55851: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855333.55861: variable 'omit' from source: magic vars 30582 1726855333.55868: starting attempt loop 30582 1726855333.55871: running the handler 30582 1726855333.55915: variable '__network_connections_result' from source: set_fact 30582 1726855333.55971: variable '__network_connections_result' from source: set_fact 30582 1726855333.56048: handler run complete 30582 1726855333.56064: attempt loop complete, returning result 30582 1726855333.56067: _execute() done 30582 1726855333.56070: dumping result to json 30582 1726855333.56076: done dumping result, returning 30582 1726855333.56084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001472] 30582 1726855333.56095: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001472 30582 1726855333.56180: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001472 30582 1726855333.56183: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30582 1726855333.56282: no more pending results, returning what we have 30582 1726855333.56285: results queue empty 30582 1726855333.56286: checking for any_errors_fatal 30582 1726855333.56296: done checking for any_errors_fatal 30582 1726855333.56297: checking for max_fail_percentage 30582 1726855333.56299: done checking for max_fail_percentage 30582 1726855333.56300: checking to see if all hosts have 
failed and the running result is not ok 30582 1726855333.56300: done checking to see if all hosts have failed 30582 1726855333.56301: getting the remaining hosts for this loop 30582 1726855333.56302: done getting the remaining hosts for this loop 30582 1726855333.56306: getting the next task for host managed_node3 30582 1726855333.56314: done getting next task for host managed_node3 30582 1726855333.56318: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855333.56322: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855333.56334: getting variables 30582 1726855333.56335: in VariableManager get_vars() 30582 1726855333.56370: Calling all_inventory to load vars for managed_node3 30582 1726855333.56372: Calling groups_inventory to load vars for managed_node3 30582 1726855333.56374: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.56385: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.56395: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.56399: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.57217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.58103: done with get_vars() 30582 1726855333.58124: done getting variables 30582 1726855333.58170: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:13 -0400 (0:00:00.038) 0:01:09.931 ****** 30582 1726855333.58197: entering _queue_task() for managed_node3/debug 30582 1726855333.58463: worker is 1 (out of 1 available) 30582 1726855333.58479: exiting _queue_task() for managed_node3/debug 30582 1726855333.58494: done queuing things up, now waiting for results queue to drain 30582 1726855333.58495: waiting for pending results... 
30582 1726855333.58684: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855333.58792: in run() - task 0affcc66-ac2b-aa83-7d57-000000001473 30582 1726855333.58804: variable 'ansible_search_path' from source: unknown 30582 1726855333.58808: variable 'ansible_search_path' from source: unknown 30582 1726855333.58838: calling self._execute() 30582 1726855333.58915: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.58919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.58927: variable 'omit' from source: magic vars 30582 1726855333.59211: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.59219: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.59305: variable 'network_state' from source: role '' defaults 30582 1726855333.59315: Evaluated conditional (network_state != {}): False 30582 1726855333.59318: when evaluation is False, skipping this task 30582 1726855333.59321: _execute() done 30582 1726855333.59324: dumping result to json 30582 1726855333.59327: done dumping result, returning 30582 1726855333.59336: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000001473] 30582 1726855333.59340: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001473 30582 1726855333.59430: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001473 30582 1726855333.59432: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855333.59481: no more pending results, returning what we have 30582 1726855333.59484: results queue empty 30582 1726855333.59485: checking for any_errors_fatal 30582 1726855333.59499: done checking for any_errors_fatal 30582 1726855333.59500: checking for 
max_fail_percentage 30582 1726855333.59502: done checking for max_fail_percentage 30582 1726855333.59503: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.59503: done checking to see if all hosts have failed 30582 1726855333.59504: getting the remaining hosts for this loop 30582 1726855333.59505: done getting the remaining hosts for this loop 30582 1726855333.59509: getting the next task for host managed_node3 30582 1726855333.59517: done getting next task for host managed_node3 30582 1726855333.59521: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855333.59527: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855333.59551: getting variables 30582 1726855333.59553: in VariableManager get_vars() 30582 1726855333.59597: Calling all_inventory to load vars for managed_node3 30582 1726855333.59600: Calling groups_inventory to load vars for managed_node3 30582 1726855333.59602: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.59613: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.59616: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.59618: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.60567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.61434: done with get_vars() 30582 1726855333.61457: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:13 -0400 (0:00:00.033) 0:01:09.965 ****** 30582 1726855333.61531: entering _queue_task() for managed_node3/ping 30582 1726855333.61798: worker is 1 (out of 1 available) 30582 1726855333.61811: exiting _queue_task() for managed_node3/ping 30582 1726855333.61823: done queuing things up, now waiting for results queue to drain 30582 1726855333.61824: waiting for pending results... 
30582 1726855333.62021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855333.62125: in run() - task 0affcc66-ac2b-aa83-7d57-000000001474 30582 1726855333.62137: variable 'ansible_search_path' from source: unknown 30582 1726855333.62141: variable 'ansible_search_path' from source: unknown 30582 1726855333.62171: calling self._execute() 30582 1726855333.62244: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.62248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.62256: variable 'omit' from source: magic vars 30582 1726855333.62541: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.62550: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.62556: variable 'omit' from source: magic vars 30582 1726855333.62603: variable 'omit' from source: magic vars 30582 1726855333.62624: variable 'omit' from source: magic vars 30582 1726855333.62655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855333.62685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855333.62702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855333.62718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.62729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855333.62752: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855333.62756: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.62758: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855333.62835: Set connection var ansible_timeout to 10 30582 1726855333.62838: Set connection var ansible_connection to ssh 30582 1726855333.62844: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855333.62849: Set connection var ansible_pipelining to False 30582 1726855333.62854: Set connection var ansible_shell_executable to /bin/sh 30582 1726855333.62856: Set connection var ansible_shell_type to sh 30582 1726855333.62877: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.62880: variable 'ansible_connection' from source: unknown 30582 1726855333.62884: variable 'ansible_module_compression' from source: unknown 30582 1726855333.62886: variable 'ansible_shell_type' from source: unknown 30582 1726855333.62890: variable 'ansible_shell_executable' from source: unknown 30582 1726855333.62892: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.62895: variable 'ansible_pipelining' from source: unknown 30582 1726855333.62897: variable 'ansible_timeout' from source: unknown 30582 1726855333.62899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.63050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855333.63060: variable 'omit' from source: magic vars 30582 1726855333.63065: starting attempt loop 30582 1726855333.63068: running the handler 30582 1726855333.63083: _low_level_execute_command(): starting 30582 1726855333.63091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855333.63626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855333.63631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855333.63634: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855333.63637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.63678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.63682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.63703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.63765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.65461: stdout chunk (state=3): >>>/root <<< 30582 1726855333.65558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.65590: stderr chunk (state=3): >>><<< 30582 1726855333.65593: stdout chunk (state=3): >>><<< 30582 1726855333.65616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.65631: _low_level_execute_command(): starting 30582 1726855333.65638: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126 `" && echo ansible-tmp-1726855333.656156-33860-33896673893126="` echo /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126 `" ) && sleep 0' 30582 1726855333.66082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.66086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.66117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.66120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.66173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.66177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.66180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.66249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.68132: stdout chunk (state=3): >>>ansible-tmp-1726855333.656156-33860-33896673893126=/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126 <<< 30582 1726855333.68237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.68264: stderr chunk (state=3): >>><<< 30582 1726855333.68270: stdout chunk (state=3): >>><<< 30582 1726855333.68289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855333.656156-33860-33896673893126=/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.68331: variable 'ansible_module_compression' from source: unknown 30582 1726855333.68368: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855333.68403: variable 'ansible_facts' from source: unknown 30582 1726855333.68457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py 30582 1726855333.68573: Sending initial data 30582 1726855333.68577: Sent initial data (151 bytes) 30582 1726855333.69040: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.69044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855333.69046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.69048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.69050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.69114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.69117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.69118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.69171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.70718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855333.70773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855333.70832: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpz3ebj18a /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py <<< 30582 1726855333.70835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py" <<< 30582 1726855333.70893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpz3ebj18a" to remote "/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py" <<< 30582 1726855333.70896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py" <<< 30582 1726855333.71477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.71526: stderr chunk (state=3): >>><<< 30582 1726855333.71529: stdout chunk (state=3): >>><<< 30582 1726855333.71575: done transferring module to remote 30582 1726855333.71584: _low_level_execute_command(): starting 30582 1726855333.71590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/ /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py && sleep 0' 30582 1726855333.72052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.72055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855333.72058: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.72060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855333.72067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.72111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.72114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.72119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.72177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.73923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.73949: stderr chunk (state=3): >>><<< 30582 1726855333.73952: stdout chunk (state=3): >>><<< 30582 1726855333.73965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.73968: _low_level_execute_command(): starting 30582 1726855333.73980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/AnsiballZ_ping.py && sleep 0' 30582 1726855333.74434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855333.74438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855333.74440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855333.74443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855333.74445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30582 1726855333.74488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.74493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.74562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.89354: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855333.90657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855333.90692: stderr chunk (state=3): >>><<< 30582 1726855333.90696: stdout chunk (state=3): >>><<< 30582 1726855333.90713: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855333.90734: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855333.90743: _low_level_execute_command(): starting 30582 1726855333.90748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855333.656156-33860-33896673893126/ > /dev/null 2>&1 && sleep 0' 30582 1726855333.91183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855333.91194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855333.91221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.91224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855333.91226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855333.91284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855333.91291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855333.91301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855333.91356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855333.93195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855333.93220: stderr chunk (state=3): >>><<< 30582 1726855333.93223: stdout chunk (state=3): >>><<< 30582 1726855333.93238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855333.93244: handler run complete 30582 1726855333.93257: attempt loop complete, returning result 30582 1726855333.93259: _execute() done 30582 1726855333.93262: dumping result to json 30582 1726855333.93264: done dumping result, returning 30582 1726855333.93277: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000001474] 30582 1726855333.93285: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001474 30582 1726855333.93374: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001474 30582 1726855333.93376: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855333.93446: no more pending results, returning what we have 30582 1726855333.93450: results queue empty 30582 1726855333.93451: checking for any_errors_fatal 30582 1726855333.93461: done checking for any_errors_fatal 30582 1726855333.93462: checking for max_fail_percentage 30582 1726855333.93464: done checking for max_fail_percentage 30582 1726855333.93465: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.93466: done checking to see if all hosts have failed 30582 1726855333.93466: getting the remaining hosts for this loop 30582 1726855333.93468: done getting the remaining hosts for this loop 30582 1726855333.93472: getting the next task for host managed_node3 30582 1726855333.93482: done getting next task for host managed_node3 30582 1726855333.93484: ^ task is: TASK: meta (role_complete) 30582 1726855333.93497: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855333.93510: getting variables 30582 1726855333.93512: in VariableManager get_vars() 30582 1726855333.93552: Calling all_inventory to load vars for managed_node3 30582 1726855333.93555: Calling groups_inventory to load vars for managed_node3 30582 1726855333.93557: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.93567: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.93569: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.93572: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.94403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.95279: done with get_vars() 30582 1726855333.95299: done getting variables 30582 1726855333.95363: done queuing things up, now waiting for results queue to drain 30582 1726855333.95364: results queue empty 30582 1726855333.95365: checking for any_errors_fatal 30582 1726855333.95367: done checking for 
any_errors_fatal 30582 1726855333.95367: checking for max_fail_percentage 30582 1726855333.95368: done checking for max_fail_percentage 30582 1726855333.95369: checking to see if all hosts have failed and the running result is not ok 30582 1726855333.95369: done checking to see if all hosts have failed 30582 1726855333.95370: getting the remaining hosts for this loop 30582 1726855333.95371: done getting the remaining hosts for this loop 30582 1726855333.95373: getting the next task for host managed_node3 30582 1726855333.95378: done getting next task for host managed_node3 30582 1726855333.95380: ^ task is: TASK: Asserts 30582 1726855333.95381: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855333.95383: getting variables 30582 1726855333.95384: in VariableManager get_vars() 30582 1726855333.95394: Calling all_inventory to load vars for managed_node3 30582 1726855333.95395: Calling groups_inventory to load vars for managed_node3 30582 1726855333.95397: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.95401: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.95402: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.95404: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.96123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855333.96968: done with get_vars() 30582 1726855333.96983: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:02:13 -0400 (0:00:00.355) 0:01:10.320 ****** 30582 1726855333.97037: entering _queue_task() for managed_node3/include_tasks 30582 1726855333.97353: worker is 1 (out of 1 available) 30582 1726855333.97367: exiting _queue_task() for managed_node3/include_tasks 30582 1726855333.97379: done queuing things up, now waiting for results queue to drain 30582 1726855333.97381: waiting for pending results... 
30582 1726855333.97575: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855333.97656: in run() - task 0affcc66-ac2b-aa83-7d57-00000000100a 30582 1726855333.97667: variable 'ansible_search_path' from source: unknown 30582 1726855333.97670: variable 'ansible_search_path' from source: unknown 30582 1726855333.97712: variable 'lsr_assert' from source: include params 30582 1726855333.97883: variable 'lsr_assert' from source: include params 30582 1726855333.97943: variable 'omit' from source: magic vars 30582 1726855333.98044: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.98053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.98058: variable 'omit' from source: magic vars 30582 1726855333.98231: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.98239: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.98244: variable 'item' from source: unknown 30582 1726855333.98296: variable 'item' from source: unknown 30582 1726855333.98318: variable 'item' from source: unknown 30582 1726855333.98361: variable 'item' from source: unknown 30582 1726855333.98503: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855333.98506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855333.98509: variable 'omit' from source: magic vars 30582 1726855333.98579: variable 'ansible_distribution_major_version' from source: facts 30582 1726855333.98582: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855333.98589: variable 'item' from source: unknown 30582 1726855333.98634: variable 'item' from source: unknown 30582 1726855333.98653: variable 'item' from source: unknown 30582 1726855333.98698: variable 'item' from source: unknown 30582 1726855333.98760: dumping result to json 30582 1726855333.98764: done dumping result, returning 30582 
1726855333.98766: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-00000000100a] 30582 1726855333.98768: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100a 30582 1726855333.98801: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100a 30582 1726855333.98804: WORKER PROCESS EXITING 30582 1726855333.98827: no more pending results, returning what we have 30582 1726855333.98832: in VariableManager get_vars() 30582 1726855333.98874: Calling all_inventory to load vars for managed_node3 30582 1726855333.98876: Calling groups_inventory to load vars for managed_node3 30582 1726855333.98879: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855333.98897: Calling all_plugins_play to load vars for managed_node3 30582 1726855333.98901: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855333.98904: Calling groups_plugins_play to load vars for managed_node3 30582 1726855333.99734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.00690: done with get_vars() 30582 1726855334.00704: variable 'ansible_search_path' from source: unknown 30582 1726855334.00705: variable 'ansible_search_path' from source: unknown 30582 1726855334.00736: variable 'ansible_search_path' from source: unknown 30582 1726855334.00737: variable 'ansible_search_path' from source: unknown 30582 1726855334.00753: we have included files to process 30582 1726855334.00754: generating all_blocks data 30582 1726855334.00755: done generating all_blocks data 30582 1726855334.00760: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855334.00760: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855334.00762: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30582 1726855334.00838: in VariableManager get_vars() 30582 1726855334.00851: done with get_vars() 30582 1726855334.00927: done processing included file 30582 1726855334.00929: iterating over new_blocks loaded from include file 30582 1726855334.00929: in VariableManager get_vars() 30582 1726855334.00940: done with get_vars() 30582 1726855334.00941: filtering new block on tags 30582 1726855334.00964: done filtering new block on tags 30582 1726855334.00966: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 => (item=tasks/assert_device_present.yml) 30582 1726855334.00970: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855334.00971: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855334.00973: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855334.01062: in VariableManager get_vars() 30582 1726855334.01077: done with get_vars() 30582 1726855334.01135: done processing included file 30582 1726855334.01136: iterating over new_blocks loaded from include file 30582 1726855334.01137: in VariableManager get_vars() 30582 1726855334.01146: done with get_vars() 30582 1726855334.01147: filtering new block on tags 30582 1726855334.01168: done filtering new block on tags 30582 1726855334.01170: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node3 => (item=tasks/assert_profile_absent.yml) 30582 1726855334.01173: extending task lists for all hosts with included blocks 30582 1726855334.01763: done extending task lists 30582 1726855334.01764: done processing included files 30582 1726855334.01764: results queue empty 30582 1726855334.01765: checking for any_errors_fatal 30582 1726855334.01766: done checking for any_errors_fatal 30582 1726855334.01766: checking for max_fail_percentage 30582 1726855334.01767: done checking for max_fail_percentage 30582 1726855334.01768: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.01769: done checking to see if all hosts have failed 30582 1726855334.01769: getting the remaining hosts for this loop 30582 1726855334.01770: done getting the remaining hosts for this loop 30582 1726855334.01772: getting the next task for host managed_node3 30582 1726855334.01775: done getting next task for host managed_node3 30582 1726855334.01776: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855334.01778: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855334.01785: getting variables
30582 1726855334.01786: in VariableManager get_vars()
30582 1726855334.01794: Calling all_inventory to load vars for managed_node3
30582 1726855334.01796: Calling groups_inventory to load vars for managed_node3
30582 1726855334.01797: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855334.01801: Calling all_plugins_play to load vars for managed_node3
30582 1726855334.01803: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855334.01804: Calling groups_plugins_play to load vars for managed_node3
30582 1726855334.02430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855334.03282: done with get_vars()
30582 1726855334.03302: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 14:02:14 -0400 (0:00:00.063) 0:01:10.383 ******

30582 1726855334.03360: entering _queue_task() for managed_node3/include_tasks
30582 1726855334.03634: worker is 1 (out of 1 available)
30582 1726855334.03647: exiting _queue_task() for managed_node3/include_tasks
30582 1726855334.03659: done queuing things up, now waiting for results queue to drain
30582 1726855334.03660: waiting for pending results...
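[Editorial aside, not part of the captured log: each verbose ansible-core debug entry above follows the pattern `<pid> <epoch-seconds>: <message>`, e.g. `30582 1726855334.01785: getting variables`. A minimal sketch, assuming only that two-field prefix format, of parsing such entries and measuring the gap between two of them:]

```python
import re

# Verbose ansible-core debug entries look like "<pid> <unix-ts>: <message>".
ENTRY = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_entry(line):
    """Split one debug line into (pid, timestamp, message); None if no match."""
    m = ENTRY.match(line)
    if not m:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

# Two entries copied from the log above.
lines = [
    "30582 1726855334.01785: getting variables",
    "30582 1726855334.03302: done getting variables",
]
parsed = [parse_entry(line) for line in lines]
# Seconds elapsed between the two entries (~15 ms for this get_vars() pass).
gap = parsed[1][1] - parsed[0][1]
```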
30582 1726855334.03866: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855334.03948: in run() - task 0affcc66-ac2b-aa83-7d57-0000000015cf 30582 1726855334.03958: variable 'ansible_search_path' from source: unknown 30582 1726855334.03961: variable 'ansible_search_path' from source: unknown 30582 1726855334.03999: calling self._execute() 30582 1726855334.04066: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.04074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.04084: variable 'omit' from source: magic vars 30582 1726855334.04378: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.04389: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.04395: _execute() done 30582 1726855334.04398: dumping result to json 30582 1726855334.04401: done dumping result, returning 30582 1726855334.04407: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-0000000015cf] 30582 1726855334.04412: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015cf 30582 1726855334.04502: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015cf 30582 1726855334.04505: WORKER PROCESS EXITING 30582 1726855334.04559: no more pending results, returning what we have 30582 1726855334.04565: in VariableManager get_vars() 30582 1726855334.04611: Calling all_inventory to load vars for managed_node3 30582 1726855334.04615: Calling groups_inventory to load vars for managed_node3 30582 1726855334.04618: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.04632: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.04635: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.04637: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855334.09577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.10430: done with get_vars() 30582 1726855334.10449: variable 'ansible_search_path' from source: unknown 30582 1726855334.10450: variable 'ansible_search_path' from source: unknown 30582 1726855334.10458: variable 'item' from source: include params 30582 1726855334.10526: variable 'item' from source: include params 30582 1726855334.10551: we have included files to process 30582 1726855334.10552: generating all_blocks data 30582 1726855334.10553: done generating all_blocks data 30582 1726855334.10553: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855334.10554: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855334.10555: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855334.10671: done processing included file 30582 1726855334.10673: iterating over new_blocks loaded from include file 30582 1726855334.10674: in VariableManager get_vars() 30582 1726855334.10689: done with get_vars() 30582 1726855334.10691: filtering new block on tags 30582 1726855334.10708: done filtering new block on tags 30582 1726855334.10709: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855334.10713: extending task lists for all hosts with included blocks 30582 1726855334.10804: done extending task lists 30582 1726855334.10805: done processing included files 30582 1726855334.10806: results queue empty 30582 1726855334.10806: checking for any_errors_fatal 30582 1726855334.10808: done 
checking for any_errors_fatal 30582 1726855334.10809: checking for max_fail_percentage 30582 1726855334.10810: done checking for max_fail_percentage 30582 1726855334.10810: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.10811: done checking to see if all hosts have failed 30582 1726855334.10811: getting the remaining hosts for this loop 30582 1726855334.10812: done getting the remaining hosts for this loop 30582 1726855334.10813: getting the next task for host managed_node3 30582 1726855334.10816: done getting next task for host managed_node3 30582 1726855334.10817: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855334.10819: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855334.10821: getting variables
30582 1726855334.10821: in VariableManager get_vars()
30582 1726855334.10828: Calling all_inventory to load vars for managed_node3
30582 1726855334.10829: Calling groups_inventory to load vars for managed_node3
30582 1726855334.10831: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855334.10835: Calling all_plugins_play to load vars for managed_node3
30582 1726855334.10836: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855334.10838: Calling groups_plugins_play to load vars for managed_node3
30582 1726855334.11494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855334.12349: done with get_vars()
30582 1726855334.12367: done getting variables
30582 1726855334.12466: variable 'interface' from source: play vars

TASK [Get stat for interface statebr] ******************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 14:02:14 -0400 (0:00:00.091) 0:01:10.474 ******

30582 1726855334.12486: entering _queue_task() for managed_node3/stat
30582 1726855334.12771: worker is 1 (out of 1 available)
30582 1726855334.12784: exiting _queue_task() for managed_node3/stat
30582 1726855334.12797: done queuing things up, now waiting for results queue to drain
30582 1726855334.12800: waiting for pending results...
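[Editorial aside, not part of the captured log: the task banner carries two durations, the previous task's elapsed time in parentheses (`0:00:00.091`) and the cumulative playbook run time (`0:01:10.474`); the previous banner's cumulative `0:01:10.383` plus `0.091` s does indeed give `0:01:10.474`. A minimal sketch, assuming only the `H:MM:SS.mmm` banner format, of parsing those durations:]

```python
from datetime import timedelta

def parse_duration(text):
    """Parse a task-banner duration like '0:01:10.474' into a timedelta."""
    h, m, s = text.split(":")
    return timedelta(hours=int(h), minutes=int(m), seconds=float(s))

# Values from the banner above: this task took 0.091 s; cumulative 1 min 10.474 s.
task_time = parse_duration("0:00:00.091")
cumulative = parse_duration("0:01:10.474")
previous_cumulative = parse_duration("0:01:10.383")
```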
30582 1726855334.13004: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855334.13097: in run() - task 0affcc66-ac2b-aa83-7d57-000000001647 30582 1726855334.13109: variable 'ansible_search_path' from source: unknown 30582 1726855334.13113: variable 'ansible_search_path' from source: unknown 30582 1726855334.13142: calling self._execute() 30582 1726855334.13215: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.13219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.13227: variable 'omit' from source: magic vars 30582 1726855334.13525: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.13535: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.13541: variable 'omit' from source: magic vars 30582 1726855334.13583: variable 'omit' from source: magic vars 30582 1726855334.13653: variable 'interface' from source: play vars 30582 1726855334.13667: variable 'omit' from source: magic vars 30582 1726855334.13704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855334.13731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855334.13747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855334.13761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.13773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.13799: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855334.13802: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.13807: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.13882: Set connection var ansible_timeout to 10 30582 1726855334.13885: Set connection var ansible_connection to ssh 30582 1726855334.13891: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855334.13897: Set connection var ansible_pipelining to False 30582 1726855334.13902: Set connection var ansible_shell_executable to /bin/sh 30582 1726855334.13913: Set connection var ansible_shell_type to sh 30582 1726855334.14009: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.14013: variable 'ansible_connection' from source: unknown 30582 1726855334.14016: variable 'ansible_module_compression' from source: unknown 30582 1726855334.14018: variable 'ansible_shell_type' from source: unknown 30582 1726855334.14021: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.14023: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.14028: variable 'ansible_pipelining' from source: unknown 30582 1726855334.14030: variable 'ansible_timeout' from source: unknown 30582 1726855334.14032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.14224: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855334.14228: variable 'omit' from source: magic vars 30582 1726855334.14231: starting attempt loop 30582 1726855334.14233: running the handler 30582 1726855334.14235: _low_level_execute_command(): starting 30582 1726855334.14238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855334.14901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855334.14919: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30582 1726855334.14934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855334.14949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.14977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.15026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.15030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.15034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.15108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.16820: stdout chunk (state=3): >>>/root <<< 30582 1726855334.16920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.16956: stderr chunk (state=3): >>><<< 30582 1726855334.16959: stdout chunk (state=3): >>><<< 30582 1726855334.16979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.17003: _low_level_execute_command(): starting 30582 1726855334.17009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485 `" && echo ansible-tmp-1726855334.1698468-33877-118783133837485="` echo /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485 `" ) && sleep 0' 30582 1726855334.17473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.17476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.17490: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855334.17493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855334.17495: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.17540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.17546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.17549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.17617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.19536: stdout chunk (state=3): >>>ansible-tmp-1726855334.1698468-33877-118783133837485=/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485 <<< 30582 1726855334.19643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.19673: stderr chunk (state=3): >>><<< 30582 1726855334.19677: stdout chunk (state=3): >>><<< 30582 1726855334.19692: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855334.1698468-33877-118783133837485=/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.19737: variable 'ansible_module_compression' from source: unknown 30582 1726855334.19784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855334.19821: variable 'ansible_facts' from source: unknown 30582 1726855334.19882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py 30582 1726855334.19991: Sending initial data 30582 1726855334.19995: Sent initial data (153 bytes) 30582 1726855334.20455: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.20460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855334.20462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.20464: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.20466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.20520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.20527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.20529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.20586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.22173: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855334.22177: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855334.22231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855334.22296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9v3htb0k /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py <<< 30582 1726855334.22300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py" <<< 30582 1726855334.22352: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9v3htb0k" to remote "/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py" <<< 30582 1726855334.22356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py" <<< 30582 1726855334.22977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.23015: stderr chunk (state=3): >>><<< 30582 1726855334.23018: stdout chunk (state=3): >>><<< 30582 1726855334.23047: done transferring module to remote 30582 1726855334.23057: _low_level_execute_command(): starting 30582 1726855334.23063: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/ /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py && sleep 0' 30582 1726855334.23530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.23533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855334.23536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.23542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855334.23544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.23585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.23591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.23594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.23663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.25531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.25558: stderr chunk (state=3): >>><<< 30582 1726855334.25562: stdout chunk (state=3): >>><<< 30582 1726855334.25582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.25586: _low_level_execute_command(): starting 30582 1726855334.25590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/AnsiballZ_stat.py && sleep 0' 30582 1726855334.26051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.26054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855334.26057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855334.26059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.26061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30582 1726855334.26118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.26124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.26127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.26191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.41627: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32492, "dev": 23, "nlink": 1, "atime": 1726855318.8417535, "mtime": 1726855318.8417535, "ctime": 1726855318.8417535, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855334.43098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855334.43103: stdout chunk (state=3): >>><<< 30582 1726855334.43106: stderr chunk (state=3): >>><<< 30582 1726855334.43108: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32492, "dev": 23, "nlink": 1, "atime": 1726855318.8417535, "mtime": 1726855318.8417535, "ctime": 1726855318.8417535, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855334.43130: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855334.43150: _low_level_execute_command(): starting 30582 1726855334.43159: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855334.1698468-33877-118783133837485/ > /dev/null 2>&1 && sleep 0' 30582 1726855334.43647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.43674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.43722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.43725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.43797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.45897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.45901: stdout chunk (state=3): >>><<< 30582 1726855334.45904: stderr chunk (state=3): >>><<< 30582 1726855334.45906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.45908: handler run complete 30582 1726855334.45910: attempt loop complete, returning result 30582 1726855334.45912: _execute() done 30582 1726855334.45913: dumping result to json 30582 1726855334.45915: done dumping result, returning 30582 1726855334.45916: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000001647] 30582 1726855334.45918: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001647 30582 1726855334.45997: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001647 30582 1726855334.46004: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726855318.8417535, "block_size": 4096, "blocks": 0, "ctime": 1726855318.8417535, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32492, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726855318.8417535, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30582 1726855334.46111: no more pending results, returning what we have 30582 1726855334.46117: results queue empty 30582 1726855334.46118: checking for any_errors_fatal 30582 1726855334.46119: done checking for any_errors_fatal 30582 1726855334.46120: checking for 
max_fail_percentage 30582 1726855334.46122: done checking for max_fail_percentage 30582 1726855334.46123: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.46123: done checking to see if all hosts have failed 30582 1726855334.46124: getting the remaining hosts for this loop 30582 1726855334.46125: done getting the remaining hosts for this loop 30582 1726855334.46129: getting the next task for host managed_node3 30582 1726855334.46137: done getting next task for host managed_node3 30582 1726855334.46139: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30582 1726855334.46141: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855334.46146: getting variables 30582 1726855334.46147: in VariableManager get_vars() 30582 1726855334.46186: Calling all_inventory to load vars for managed_node3 30582 1726855334.46199: Calling groups_inventory to load vars for managed_node3 30582 1726855334.46202: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.46214: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.46219: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.46223: Calling groups_plugins_play to load vars for managed_node3 30582 1726855334.47641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.48518: done with get_vars() 30582 1726855334.48536: done getting variables 30582 1726855334.48582: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855334.48674: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 14:02:14 -0400 (0:00:00.362) 0:01:10.836 ****** 30582 1726855334.48705: entering _queue_task() for managed_node3/assert 30582 1726855334.48970: worker is 1 (out of 1 available) 30582 1726855334.48985: exiting _queue_task() for managed_node3/assert 30582 1726855334.48999: done queuing things up, now waiting for results queue to drain 30582 1726855334.49001: waiting for pending results... 
30582 1726855334.49425: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' 30582 1726855334.49431: in run() - task 0affcc66-ac2b-aa83-7d57-0000000015d0 30582 1726855334.49434: variable 'ansible_search_path' from source: unknown 30582 1726855334.49437: variable 'ansible_search_path' from source: unknown 30582 1726855334.49440: calling self._execute() 30582 1726855334.49538: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.49551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.49567: variable 'omit' from source: magic vars 30582 1726855334.49997: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.50010: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.50019: variable 'omit' from source: magic vars 30582 1726855334.50093: variable 'omit' from source: magic vars 30582 1726855334.50224: variable 'interface' from source: play vars 30582 1726855334.50251: variable 'omit' from source: magic vars 30582 1726855334.50309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855334.50338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855334.50355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855334.50376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.50382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.50408: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855334.50412: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.50415: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.50490: Set connection var ansible_timeout to 10 30582 1726855334.50496: Set connection var ansible_connection to ssh 30582 1726855334.50499: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855334.50503: Set connection var ansible_pipelining to False 30582 1726855334.50508: Set connection var ansible_shell_executable to /bin/sh 30582 1726855334.50511: Set connection var ansible_shell_type to sh 30582 1726855334.50528: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.50531: variable 'ansible_connection' from source: unknown 30582 1726855334.50534: variable 'ansible_module_compression' from source: unknown 30582 1726855334.50537: variable 'ansible_shell_type' from source: unknown 30582 1726855334.50539: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.50542: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.50544: variable 'ansible_pipelining' from source: unknown 30582 1726855334.50547: variable 'ansible_timeout' from source: unknown 30582 1726855334.50549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.50660: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855334.50673: variable 'omit' from source: magic vars 30582 1726855334.50678: starting attempt loop 30582 1726855334.50681: running the handler 30582 1726855334.50779: variable 'interface_stat' from source: set_fact 30582 1726855334.50795: Evaluated conditional (interface_stat.stat.exists): True 30582 1726855334.50801: handler run complete 30582 1726855334.50814: attempt loop complete, returning result 30582 
1726855334.50819: _execute() done 30582 1726855334.50821: dumping result to json 30582 1726855334.50823: done dumping result, returning 30582 1726855334.50829: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'statebr' [0affcc66-ac2b-aa83-7d57-0000000015d0] 30582 1726855334.50835: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d0 30582 1726855334.50922: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d0 30582 1726855334.50925: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855334.50976: no more pending results, returning what we have 30582 1726855334.50980: results queue empty 30582 1726855334.50981: checking for any_errors_fatal 30582 1726855334.50994: done checking for any_errors_fatal 30582 1726855334.50995: checking for max_fail_percentage 30582 1726855334.50996: done checking for max_fail_percentage 30582 1726855334.50997: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.50998: done checking to see if all hosts have failed 30582 1726855334.50999: getting the remaining hosts for this loop 30582 1726855334.51001: done getting the remaining hosts for this loop 30582 1726855334.51004: getting the next task for host managed_node3 30582 1726855334.51014: done getting next task for host managed_node3 30582 1726855334.51017: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855334.51022: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855334.51027: getting variables 30582 1726855334.51029: in VariableManager get_vars() 30582 1726855334.51069: Calling all_inventory to load vars for managed_node3 30582 1726855334.51072: Calling groups_inventory to load vars for managed_node3 30582 1726855334.51075: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.51086: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.51096: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.51100: Calling groups_plugins_play to load vars for managed_node3 30582 1726855334.51919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.52929: done with get_vars() 30582 1726855334.52948: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 14:02:14 -0400 (0:00:00.043) 0:01:10.880 ****** 30582 1726855334.53021: entering _queue_task() for managed_node3/include_tasks 30582 1726855334.53296: worker is 1 (out of 1 available) 30582 1726855334.53310: exiting _queue_task() for managed_node3/include_tasks 30582 1726855334.53323: done queuing things up, now waiting for results queue to drain 30582 1726855334.53325: waiting for pending results... 
30582 1726855334.53529: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855334.53623: in run() - task 0affcc66-ac2b-aa83-7d57-0000000015d4 30582 1726855334.53632: variable 'ansible_search_path' from source: unknown 30582 1726855334.53636: variable 'ansible_search_path' from source: unknown 30582 1726855334.53666: calling self._execute() 30582 1726855334.53739: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.53744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.53752: variable 'omit' from source: magic vars 30582 1726855334.54037: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.54048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.54052: _execute() done 30582 1726855334.54058: dumping result to json 30582 1726855334.54063: done dumping result, returning 30582 1726855334.54066: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-0000000015d4] 30582 1726855334.54075: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d4 30582 1726855334.54168: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d4 30582 1726855334.54171: WORKER PROCESS EXITING 30582 1726855334.54221: no more pending results, returning what we have 30582 1726855334.54226: in VariableManager get_vars() 30582 1726855334.54272: Calling all_inventory to load vars for managed_node3 30582 1726855334.54275: Calling groups_inventory to load vars for managed_node3 30582 1726855334.54278: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.54294: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.54297: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.54300: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855334.55108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.55976: done with get_vars() 30582 1726855334.55993: variable 'ansible_search_path' from source: unknown 30582 1726855334.55994: variable 'ansible_search_path' from source: unknown 30582 1726855334.56001: variable 'item' from source: include params 30582 1726855334.56082: variable 'item' from source: include params 30582 1726855334.56108: we have included files to process 30582 1726855334.56109: generating all_blocks data 30582 1726855334.56110: done generating all_blocks data 30582 1726855334.56114: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855334.56115: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855334.56116: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855334.56717: done processing included file 30582 1726855334.56718: iterating over new_blocks loaded from include file 30582 1726855334.56719: in VariableManager get_vars() 30582 1726855334.56731: done with get_vars() 30582 1726855334.56732: filtering new block on tags 30582 1726855334.56773: done filtering new block on tags 30582 1726855334.56775: in VariableManager get_vars() 30582 1726855334.56789: done with get_vars() 30582 1726855334.56790: filtering new block on tags 30582 1726855334.56825: done filtering new block on tags 30582 1726855334.56826: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855334.56830: extending task lists for all hosts with included blocks 30582 1726855334.56977: done 
extending task lists 30582 1726855334.56979: done processing included files 30582 1726855334.56979: results queue empty 30582 1726855334.56980: checking for any_errors_fatal 30582 1726855334.56983: done checking for any_errors_fatal 30582 1726855334.56984: checking for max_fail_percentage 30582 1726855334.56985: done checking for max_fail_percentage 30582 1726855334.56985: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.56986: done checking to see if all hosts have failed 30582 1726855334.56986: getting the remaining hosts for this loop 30582 1726855334.56989: done getting the remaining hosts for this loop 30582 1726855334.56991: getting the next task for host managed_node3 30582 1726855334.56994: done getting next task for host managed_node3 30582 1726855334.56995: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855334.56997: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855334.57000: getting variables 30582 1726855334.57001: in VariableManager get_vars() 30582 1726855334.57008: Calling all_inventory to load vars for managed_node3 30582 1726855334.57010: Calling groups_inventory to load vars for managed_node3 30582 1726855334.57011: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.57016: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.57017: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.57019: Calling groups_plugins_play to load vars for managed_node3 30582 1726855334.57676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.58528: done with get_vars() 30582 1726855334.58544: done getting variables 30582 1726855334.58574: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:02:14 -0400 (0:00:00.055) 0:01:10.935 ****** 30582 1726855334.58599: entering _queue_task() for managed_node3/set_fact 30582 1726855334.58873: worker is 1 (out of 1 available) 30582 1726855334.58885: exiting _queue_task() for managed_node3/set_fact 30582 1726855334.58897: done queuing things up, now waiting for results queue to drain 30582 1726855334.58899: waiting for pending results... 
30582 1726855334.59098: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855334.59189: in run() - task 0affcc66-ac2b-aa83-7d57-000000001665 30582 1726855334.59203: variable 'ansible_search_path' from source: unknown 30582 1726855334.59206: variable 'ansible_search_path' from source: unknown 30582 1726855334.59236: calling self._execute() 30582 1726855334.59311: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.59315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.59324: variable 'omit' from source: magic vars 30582 1726855334.59611: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.59620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.59625: variable 'omit' from source: magic vars 30582 1726855334.59668: variable 'omit' from source: magic vars 30582 1726855334.59696: variable 'omit' from source: magic vars 30582 1726855334.59729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855334.59757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855334.59781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855334.59796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.59806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.59830: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855334.59833: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.59836: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855334.59914: Set connection var ansible_timeout to 10 30582 1726855334.59918: Set connection var ansible_connection to ssh 30582 1726855334.59922: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855334.59927: Set connection var ansible_pipelining to False 30582 1726855334.59932: Set connection var ansible_shell_executable to /bin/sh 30582 1726855334.59935: Set connection var ansible_shell_type to sh 30582 1726855334.59952: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.59955: variable 'ansible_connection' from source: unknown 30582 1726855334.59958: variable 'ansible_module_compression' from source: unknown 30582 1726855334.59960: variable 'ansible_shell_type' from source: unknown 30582 1726855334.59962: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.59965: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.59969: variable 'ansible_pipelining' from source: unknown 30582 1726855334.59974: variable 'ansible_timeout' from source: unknown 30582 1726855334.59979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.60080: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855334.60096: variable 'omit' from source: magic vars 30582 1726855334.60099: starting attempt loop 30582 1726855334.60101: running the handler 30582 1726855334.60113: handler run complete 30582 1726855334.60121: attempt loop complete, returning result 30582 1726855334.60124: _execute() done 30582 1726855334.60126: dumping result to json 30582 1726855334.60129: done dumping result, returning 30582 1726855334.60136: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-000000001665] 30582 1726855334.60141: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001665 30582 1726855334.60224: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001665 30582 1726855334.60227: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855334.60282: no more pending results, returning what we have 30582 1726855334.60285: results queue empty 30582 1726855334.60286: checking for any_errors_fatal 30582 1726855334.60289: done checking for any_errors_fatal 30582 1726855334.60290: checking for max_fail_percentage 30582 1726855334.60292: done checking for max_fail_percentage 30582 1726855334.60293: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.60293: done checking to see if all hosts have failed 30582 1726855334.60294: getting the remaining hosts for this loop 30582 1726855334.60295: done getting the remaining hosts for this loop 30582 1726855334.60299: getting the next task for host managed_node3 30582 1726855334.60308: done getting next task for host managed_node3 30582 1726855334.60310: ^ task is: TASK: Stat profile file 30582 1726855334.60315: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855334.60320: getting variables 30582 1726855334.60322: in VariableManager get_vars() 30582 1726855334.60361: Calling all_inventory to load vars for managed_node3 30582 1726855334.60363: Calling groups_inventory to load vars for managed_node3 30582 1726855334.60366: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.60376: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.60379: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.60382: Calling groups_plugins_play to load vars for managed_node3 30582 1726855334.61204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855334.62082: done with get_vars() 30582 1726855334.62102: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 14:02:14 -0400 (0:00:00.035) 0:01:10.971 ****** 30582 1726855334.62174: entering _queue_task() for managed_node3/stat 30582 1726855334.62441: worker is 1 (out of 1 available) 30582 1726855334.62455: exiting _queue_task() for managed_node3/stat 30582 1726855334.62467: done queuing things up, now waiting for results queue to drain 30582 1726855334.62469: 
waiting for pending results... 30582 1726855334.62656: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855334.62740: in run() - task 0affcc66-ac2b-aa83-7d57-000000001666 30582 1726855334.62752: variable 'ansible_search_path' from source: unknown 30582 1726855334.62755: variable 'ansible_search_path' from source: unknown 30582 1726855334.62785: calling self._execute() 30582 1726855334.62860: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.62864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.62874: variable 'omit' from source: magic vars 30582 1726855334.63163: variable 'ansible_distribution_major_version' from source: facts 30582 1726855334.63176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855334.63181: variable 'omit' from source: magic vars 30582 1726855334.63221: variable 'omit' from source: magic vars 30582 1726855334.63295: variable 'profile' from source: play vars 30582 1726855334.63299: variable 'interface' from source: play vars 30582 1726855334.63348: variable 'interface' from source: play vars 30582 1726855334.63362: variable 'omit' from source: magic vars 30582 1726855334.63398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855334.63424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855334.63442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855334.63458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.63468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855334.63495: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855334.63498: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.63501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.63574: Set connection var ansible_timeout to 10 30582 1726855334.63577: Set connection var ansible_connection to ssh 30582 1726855334.63583: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855334.63589: Set connection var ansible_pipelining to False 30582 1726855334.63594: Set connection var ansible_shell_executable to /bin/sh 30582 1726855334.63598: Set connection var ansible_shell_type to sh 30582 1726855334.63614: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.63616: variable 'ansible_connection' from source: unknown 30582 1726855334.63619: variable 'ansible_module_compression' from source: unknown 30582 1726855334.63621: variable 'ansible_shell_type' from source: unknown 30582 1726855334.63623: variable 'ansible_shell_executable' from source: unknown 30582 1726855334.63625: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855334.63630: variable 'ansible_pipelining' from source: unknown 30582 1726855334.63633: variable 'ansible_timeout' from source: unknown 30582 1726855334.63635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855334.63792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855334.63799: variable 'omit' from source: magic vars 30582 1726855334.63805: starting attempt loop 30582 1726855334.63808: running the handler 30582 1726855334.63819: _low_level_execute_command(): starting 30582 1726855334.63826: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 
1726855334.64349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.64353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855334.64356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.64409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.64412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.64415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.64489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.66197: stdout chunk (state=3): >>>/root <<< 30582 1726855334.66300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.66330: stderr chunk (state=3): >>><<< 30582 1726855334.66333: stdout chunk (state=3): >>><<< 30582 1726855334.66356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.66370: _low_level_execute_command(): starting 30582 1726855334.66374: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828 `" && echo ansible-tmp-1726855334.6635635-33906-158489201701828="` echo /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828 `" ) && sleep 0' 30582 1726855334.66850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.66853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855334.66864: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.66866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855334.66868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.66901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.66905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.66984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.68909: stdout chunk (state=3): >>>ansible-tmp-1726855334.6635635-33906-158489201701828=/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828 <<< 30582 1726855334.69014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.69040: stderr chunk (state=3): >>><<< 30582 1726855334.69043: stdout chunk (state=3): >>><<< 30582 1726855334.69059: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855334.6635635-33906-158489201701828=/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.69103: variable 'ansible_module_compression' from source: unknown 30582 1726855334.69150: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855334.69182: variable 'ansible_facts' from source: unknown 30582 1726855334.69247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py 30582 1726855334.69353: Sending initial data 30582 1726855334.69357: Sent initial data (153 bytes) 30582 1726855334.69783: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.69792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855334.69815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.69818: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855334.69821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.69878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.69886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.69939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.71496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30582 1726855334.71505: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855334.71552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855334.71613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxn7t2e53 /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py <<< 30582 1726855334.71616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py" <<< 30582 1726855334.71670: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxn7t2e53" to remote "/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py" <<< 30582 1726855334.71673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py" <<< 30582 1726855334.72264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.72309: stderr chunk (state=3): >>><<< 30582 1726855334.72312: stdout chunk (state=3): >>><<< 30582 1726855334.72329: done transferring module to remote 30582 1726855334.72339: _low_level_execute_command(): starting 30582 1726855334.72341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/ /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py && sleep 0' 30582 1726855334.72795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.72798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855334.72800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.72803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.72809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.72849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.72852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.72918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.74710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.74739: stderr chunk (state=3): >>><<< 30582 1726855334.74742: stdout chunk (state=3): >>><<< 30582 1726855334.74758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.74761: _low_level_execute_command(): starting 30582 1726855334.74766: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/AnsiballZ_stat.py && sleep 0' 30582 1726855334.75215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.75219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.75221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.75224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.75275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.75278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.75348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.90845: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855334.92231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855334.92249: stderr chunk (state=3): >>><<< 30582 1726855334.92253: stdout chunk (state=3): >>><<< 30582 1726855334.92270: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855334.92300: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855334.92308: _low_level_execute_command(): starting 30582 1726855334.92313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855334.6635635-33906-158489201701828/ > /dev/null 2>&1 && sleep 0' 30582 1726855334.92749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855334.92753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
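The `_execute_module (stat, {...})` dump above records the full module arguments for the "Stat profile file" task at get_profile_stat.yml:9. As a rough sketch, the task likely looks something like the following (the module options, rendered path, and the later availability of a `profile_stat` registered variable are taken from this log; the task body itself and the `{{ interface }}` templating are assumptions — the log only shows the rendered path `/etc/sysconfig/network-scripts/ifcfg-statebr` and that `profile`/`interface` play vars were resolved beforehand):

```yaml
# Hedged reconstruction from the log, not the actual source file:
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ interface }}"  # rendered as ifcfg-statebr in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # later tasks in this log evaluate profile_stat.stat.exists
```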
30582 1726855334.92755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855334.92757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855334.92765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855334.92814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855334.92817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855334.92819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855334.92879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855334.94828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855334.94832: stdout chunk (state=3): >>><<< 30582 1726855334.94835: stderr chunk (state=3): >>><<< 30582 1726855334.94837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855334.94926: handler run complete 30582 1726855334.94929: attempt loop complete, returning result 30582 1726855334.94931: _execute() done 30582 1726855334.94934: dumping result to json 30582 1726855334.94936: done dumping result, returning 30582 1726855334.94938: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-000000001666] 30582 1726855334.95195: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001666 30582 1726855334.95275: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001666 30582 1726855334.95279: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855334.95344: no more pending results, returning what we have 30582 1726855334.95348: results queue empty 30582 1726855334.95349: checking for any_errors_fatal 30582 1726855334.95358: done checking for any_errors_fatal 30582 1726855334.95359: checking for max_fail_percentage 30582 1726855334.95361: done checking for max_fail_percentage 30582 1726855334.95362: checking to see if all hosts have failed and the running result is not ok 30582 1726855334.95363: done checking to see if all hosts have failed 30582 1726855334.95364: getting the remaining hosts for this loop 30582 1726855334.95366: done getting the remaining hosts for this loop 30582 1726855334.95373: getting the next task for host managed_node3 30582 
1726855334.95382: done getting next task for host managed_node3 30582 1726855334.95385: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855334.95393: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855334.95399: getting variables 30582 1726855334.95401: in VariableManager get_vars() 30582 1726855334.95446: Calling all_inventory to load vars for managed_node3 30582 1726855334.95449: Calling groups_inventory to load vars for managed_node3 30582 1726855334.95452: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855334.95470: Calling all_plugins_play to load vars for managed_node3 30582 1726855334.95474: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855334.95477: Calling groups_plugins_play to load vars for managed_node3 30582 1726855334.98237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.01853: done with get_vars() 30582 1726855335.01894: done getting variables 30582 1726855335.02117: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:02:15 -0400 (0:00:00.399) 0:01:11.371 ****** 30582 1726855335.02153: entering _queue_task() for managed_node3/set_fact 30582 1726855335.02625: worker is 1 (out of 1 available) 30582 1726855335.02637: exiting _queue_task() for managed_node3/set_fact 30582 1726855335.02651: done queuing things up, now waiting for results queue to drain 30582 1726855335.02653: waiting for pending results... 
30582 1726855335.03009: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855335.03148: in run() - task 0affcc66-ac2b-aa83-7d57-000000001667 30582 1726855335.03153: variable 'ansible_search_path' from source: unknown 30582 1726855335.03156: variable 'ansible_search_path' from source: unknown 30582 1726855335.03257: calling self._execute() 30582 1726855335.03401: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.03411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.03429: variable 'omit' from source: magic vars 30582 1726855335.03974: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.03978: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.04079: variable 'profile_stat' from source: set_fact 30582 1726855335.04098: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855335.04106: when evaluation is False, skipping this task 30582 1726855335.04113: _execute() done 30582 1726855335.04119: dumping result to json 30582 1726855335.04133: done dumping result, returning 30582 1726855335.04145: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-000000001667] 30582 1726855335.04155: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001667 30582 1726855335.04425: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001667 30582 1726855335.04428: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855335.04484: no more pending results, returning what we have 30582 1726855335.04491: results queue empty 30582 1726855335.04493: checking for any_errors_fatal 30582 1726855335.04504: done checking for any_errors_fatal 30582 1726855335.04505: 
checking for max_fail_percentage 30582 1726855335.04507: done checking for max_fail_percentage 30582 1726855335.04508: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.04509: done checking to see if all hosts have failed 30582 1726855335.04509: getting the remaining hosts for this loop 30582 1726855335.04511: done getting the remaining hosts for this loop 30582 1726855335.04515: getting the next task for host managed_node3 30582 1726855335.04524: done getting next task for host managed_node3 30582 1726855335.04526: ^ task is: TASK: Get NM profile info 30582 1726855335.04532: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.04546: getting variables 30582 1726855335.04548: in VariableManager get_vars() 30582 1726855335.04593: Calling all_inventory to load vars for managed_node3 30582 1726855335.04596: Calling groups_inventory to load vars for managed_node3 30582 1726855335.04600: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.04614: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.04618: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.04621: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.07161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.11097: done with get_vars() 30582 1726855335.11122: done getting variables 30582 1726855335.11368: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:02:15 -0400 (0:00:00.092) 0:01:11.463 ****** 30582 1726855335.11406: entering _queue_task() for managed_node3/shell 30582 1726855335.12177: worker is 1 (out of 1 available) 30582 1726855335.12192: exiting _queue_task() for managed_node3/shell 30582 1726855335.12204: done queuing things up, now waiting for results queue to drain 30582 1726855335.12206: waiting for pending results... 
30582 1726855335.12697: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855335.12899: in run() - task 0affcc66-ac2b-aa83-7d57-000000001668 30582 1726855335.12913: variable 'ansible_search_path' from source: unknown 30582 1726855335.12917: variable 'ansible_search_path' from source: unknown 30582 1726855335.12950: calling self._execute() 30582 1726855335.13045: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.13050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.13062: variable 'omit' from source: magic vars 30582 1726855335.13841: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.13857: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.13863: variable 'omit' from source: magic vars 30582 1726855335.14129: variable 'omit' from source: magic vars 30582 1726855335.14229: variable 'profile' from source: play vars 30582 1726855335.14233: variable 'interface' from source: play vars 30582 1726855335.14502: variable 'interface' from source: play vars 30582 1726855335.14530: variable 'omit' from source: magic vars 30582 1726855335.14564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855335.14642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855335.14648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855335.14651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855335.14654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855335.14690: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 
1726855335.14896: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.14899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.15077: Set connection var ansible_timeout to 10 30582 1726855335.15081: Set connection var ansible_connection to ssh 30582 1726855335.15083: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855335.15085: Set connection var ansible_pipelining to False 30582 1726855335.15089: Set connection var ansible_shell_executable to /bin/sh 30582 1726855335.15091: Set connection var ansible_shell_type to sh 30582 1726855335.15093: variable 'ansible_shell_executable' from source: unknown 30582 1726855335.15095: variable 'ansible_connection' from source: unknown 30582 1726855335.15097: variable 'ansible_module_compression' from source: unknown 30582 1726855335.15099: variable 'ansible_shell_type' from source: unknown 30582 1726855335.15100: variable 'ansible_shell_executable' from source: unknown 30582 1726855335.15102: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.15104: variable 'ansible_pipelining' from source: unknown 30582 1726855335.15106: variable 'ansible_timeout' from source: unknown 30582 1726855335.15108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.15799: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855335.15802: variable 'omit' from source: magic vars 30582 1726855335.15805: starting attempt loop 30582 1726855335.15807: running the handler 30582 1726855335.15809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855335.15812: _low_level_execute_command(): starting 30582 1726855335.15814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855335.16732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855335.16804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855335.16845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855335.16858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.17106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.17199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.18899: stdout chunk (state=3): >>>/root <<< 30582 1726855335.19004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855335.19052: stderr chunk (state=3): 
>>><<< 30582 1726855335.19056: stdout chunk (state=3): >>><<< 30582 1726855335.19086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855335.19104: _low_level_execute_command(): starting 30582 1726855335.19111: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588 `" && echo ansible-tmp-1726855335.1908884-33927-46157138865588="` echo /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588 `" ) && sleep 0' 30582 1726855335.20221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855335.20500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.20593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.20600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.22598: stdout chunk (state=3): >>>ansible-tmp-1726855335.1908884-33927-46157138865588=/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588 <<< 30582 1726855335.22797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855335.22800: stdout chunk (state=3): >>><<< 30582 1726855335.22809: stderr chunk (state=3): >>><<< 30582 1726855335.22897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855335.1908884-33927-46157138865588=/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855335.22900: variable 'ansible_module_compression' from source: unknown 30582 1726855335.22922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855335.22961: variable 'ansible_facts' from source: unknown 30582 1726855335.23294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py 30582 1726855335.23746: Sending initial data 30582 1726855335.23749: Sent initial data (155 bytes) 30582 1726855335.24708: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855335.24997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.25207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.25349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.26970: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855335.27026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855335.27137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_4p_mcz9 /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py <<< 30582 1726855335.27140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py" <<< 30582 1726855335.27168: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_4p_mcz9" to remote "/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py" <<< 30582 1726855335.28696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855335.28700: stderr chunk (state=3): >>><<< 30582 1726855335.28702: stdout chunk (state=3): >>><<< 30582 1726855335.28704: done transferring module to remote 30582 1726855335.28706: _low_level_execute_command(): starting 30582 1726855335.28708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/ /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py && sleep 0' 30582 1726855335.29671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855335.29695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855335.29712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855335.29737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855335.29756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 
1726855335.29816: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855335.29886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855335.29907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.29923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.30029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.31909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855335.31936: stderr chunk (state=3): >>><<< 30582 1726855335.31944: stdout chunk (state=3): >>><<< 30582 1726855335.31965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855335.31973: _low_level_execute_command(): starting 30582 1726855335.31983: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/AnsiballZ_command.py && sleep 0' 30582 1726855335.33009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855335.33071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855335.33158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855335.33179: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.33336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.33441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.50445: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:02:15.485548", "end": "2024-09-20 14:02:15.503414", "delta": "0:00:00.017866", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855335.52020: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855335.52025: stdout chunk (state=3): >>><<< 30582 1726855335.52028: stderr chunk (state=3): >>><<< 30582 1726855335.52167: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:02:15.485548", "end": "2024-09-20 14:02:15.503414", "delta": "0:00:00.017866", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 
closed. 30582 1726855335.52172: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855335.52175: _low_level_execute_command(): starting 30582 1726855335.52177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855335.1908884-33927-46157138865588/ > /dev/null 2>&1 && sleep 0' 30582 1726855335.52727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855335.52746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855335.52760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855335.52778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855335.52856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855335.52904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855335.52928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855335.52950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855335.53059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855335.55095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855335.55099: stderr chunk (state=3): >>><<< 30582 1726855335.55101: stdout chunk (state=3): >>><<< 30582 1726855335.55104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855335.55106: handler run complete 30582 1726855335.55108: Evaluated conditional (False): False 30582 1726855335.55110: attempt loop complete, returning result 30582 1726855335.55112: _execute() done 30582 1726855335.55114: dumping result to json 30582 1726855335.55116: done dumping result, returning 30582 1726855335.55118: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-000000001668] 30582 1726855335.55119: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001668 30582 1726855335.55190: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001668 fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017866", "end": "2024-09-20 14:02:15.503414", "rc": 1, "start": "2024-09-20 14:02:15.485548" } MSG: non-zero return code ...ignoring 30582 1726855335.55267: no more pending results, returning what we have 30582 1726855335.55273: results queue empty 30582 1726855335.55274: checking for any_errors_fatal 30582 1726855335.55280: done checking for any_errors_fatal 30582 1726855335.55281: checking for max_fail_percentage 30582 1726855335.55283: done checking for max_fail_percentage 30582 1726855335.55284: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.55285: done checking to see if all hosts have failed 30582 1726855335.55286: getting the remaining hosts for this loop 30582 1726855335.55291: done getting the remaining hosts for this loop 30582 1726855335.55296: getting the next task for host managed_node3 30582 1726855335.55306: done getting next task for host managed_node3 30582 1726855335.55309: ^ task is: TASK: Set NM profile exist flag and 
ansible_managed flag true based on the nmcli output 30582 1726855335.55314: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.55321: getting variables 30582 1726855335.55322: in VariableManager get_vars() 30582 1726855335.55363: Calling all_inventory to load vars for managed_node3 30582 1726855335.55366: Calling groups_inventory to load vars for managed_node3 30582 1726855335.55371: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.55386: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.55509: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.55515: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.56035: WORKER PROCESS EXITING 30582 1726855335.57031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.58664: done with get_vars() 30582 1726855335.58695: done getting variables 30582 1726855335.58759: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 14:02:15 -0400 (0:00:00.473) 0:01:11.937 ****** 30582 1726855335.58797: entering _queue_task() for managed_node3/set_fact 30582 1726855335.59179: worker is 1 (out of 1 available) 30582 1726855335.59195: exiting _queue_task() for managed_node3/set_fact 30582 1726855335.59208: done queuing things up, now waiting for results queue to drain 30582 1726855335.59210: waiting for pending results... 
30582 1726855335.59507: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855335.59635: in run() - task 0affcc66-ac2b-aa83-7d57-000000001669 30582 1726855335.59655: variable 'ansible_search_path' from source: unknown 30582 1726855335.59661: variable 'ansible_search_path' from source: unknown 30582 1726855335.59702: calling self._execute() 30582 1726855335.59803: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.59989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.59995: variable 'omit' from source: magic vars 30582 1726855335.60176: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.60197: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.60340: variable 'nm_profile_exists' from source: set_fact 30582 1726855335.60358: Evaluated conditional (nm_profile_exists.rc == 0): False 30582 1726855335.60367: when evaluation is False, skipping this task 30582 1726855335.60377: _execute() done 30582 1726855335.60385: dumping result to json 30582 1726855335.60396: done dumping result, returning 30582 1726855335.60409: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-000000001669] 30582 1726855335.60419: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001669 30582 1726855335.60531: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001669 skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30582 1726855335.60612: no more pending results, returning what we have 30582 1726855335.60616: results queue empty 30582 1726855335.60617: checking for any_errors_fatal 30582 1726855335.60627: done checking for any_errors_fatal 30582 
1726855335.60628: checking for max_fail_percentage 30582 1726855335.60630: done checking for max_fail_percentage 30582 1726855335.60631: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.60632: done checking to see if all hosts have failed 30582 1726855335.60633: getting the remaining hosts for this loop 30582 1726855335.60634: done getting the remaining hosts for this loop 30582 1726855335.60638: getting the next task for host managed_node3 30582 1726855335.60650: done getting next task for host managed_node3 30582 1726855335.60653: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855335.60659: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.60664: getting variables 30582 1726855335.60666: in VariableManager get_vars() 30582 1726855335.60712: Calling all_inventory to load vars for managed_node3 30582 1726855335.60715: Calling groups_inventory to load vars for managed_node3 30582 1726855335.60719: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.60733: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.60738: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.60741: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.61483: WORKER PROCESS EXITING 30582 1726855335.62948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.64471: done with get_vars() 30582 1726855335.64504: done getting variables 30582 1726855335.64565: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855335.64682: variable 'profile' from source: play vars 30582 1726855335.64686: variable 'interface' from source: play vars 30582 1726855335.64747: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 14:02:15 -0400 (0:00:00.059) 0:01:11.997 ****** 30582 1726855335.64781: entering _queue_task() for managed_node3/command 30582 1726855335.65140: worker is 1 (out of 1 available) 30582 1726855335.65154: exiting _queue_task() for managed_node3/command 30582 1726855335.65167: done queuing things up, now waiting for results queue to drain 30582 1726855335.65168: 
waiting for pending results... 30582 1726855335.65451: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30582 1726855335.65589: in run() - task 0affcc66-ac2b-aa83-7d57-00000000166b 30582 1726855335.65614: variable 'ansible_search_path' from source: unknown 30582 1726855335.65622: variable 'ansible_search_path' from source: unknown 30582 1726855335.65662: calling self._execute() 30582 1726855335.65759: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.65770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.65783: variable 'omit' from source: magic vars 30582 1726855335.66157: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.66174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.66299: variable 'profile_stat' from source: set_fact 30582 1726855335.66315: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855335.66323: when evaluation is False, skipping this task 30582 1726855335.66330: _execute() done 30582 1726855335.66337: dumping result to json 30582 1726855335.66345: done dumping result, returning 30582 1726855335.66356: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000166b] 30582 1726855335.66368: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166b skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855335.66526: no more pending results, returning what we have 30582 1726855335.66530: results queue empty 30582 1726855335.66531: checking for any_errors_fatal 30582 1726855335.66538: done checking for any_errors_fatal 30582 1726855335.66539: checking for max_fail_percentage 30582 1726855335.66541: done checking for max_fail_percentage 30582 1726855335.66542: 
checking to see if all hosts have failed and the running result is not ok 30582 1726855335.66543: done checking to see if all hosts have failed 30582 1726855335.66543: getting the remaining hosts for this loop 30582 1726855335.66545: done getting the remaining hosts for this loop 30582 1726855335.66548: getting the next task for host managed_node3 30582 1726855335.66556: done getting next task for host managed_node3 30582 1726855335.66559: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855335.66564: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.66569: getting variables 30582 1726855335.66571: in VariableManager get_vars() 30582 1726855335.66612: Calling all_inventory to load vars for managed_node3 30582 1726855335.66615: Calling groups_inventory to load vars for managed_node3 30582 1726855335.66618: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.66631: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.66634: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.66636: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.67301: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166b 30582 1726855335.67305: WORKER PROCESS EXITING 30582 1726855335.68428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.70010: done with get_vars() 30582 1726855335.70037: done getting variables 30582 1726855335.70093: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855335.70198: variable 'profile' from source: play vars 30582 1726855335.70202: variable 'interface' from source: play vars 30582 1726855335.70258: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 14:02:15 -0400 (0:00:00.055) 0:01:12.052 ****** 30582 1726855335.70291: entering _queue_task() for managed_node3/set_fact 30582 1726855335.70630: worker is 1 (out of 1 available) 30582 1726855335.70643: exiting _queue_task() for managed_node3/set_fact 30582 
1726855335.70656: done queuing things up, now waiting for results queue to drain 30582 1726855335.70658: waiting for pending results... 30582 1726855335.70963: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30582 1726855335.71122: in run() - task 0affcc66-ac2b-aa83-7d57-00000000166c 30582 1726855335.71143: variable 'ansible_search_path' from source: unknown 30582 1726855335.71151: variable 'ansible_search_path' from source: unknown 30582 1726855335.71194: calling self._execute() 30582 1726855335.71293: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.71305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.71321: variable 'omit' from source: magic vars 30582 1726855335.71774: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.71794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.71957: variable 'profile_stat' from source: set_fact 30582 1726855335.71974: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855335.71988: when evaluation is False, skipping this task 30582 1726855335.71996: _execute() done 30582 1726855335.72004: dumping result to json 30582 1726855335.72011: done dumping result, returning 30582 1726855335.72022: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000166c] 30582 1726855335.72032: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166c 30582 1726855335.72294: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166c 30582 1726855335.72297: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855335.72344: no more pending results, returning what we have 30582 1726855335.72349: results queue empty 
30582 1726855335.72350: checking for any_errors_fatal 30582 1726855335.72359: done checking for any_errors_fatal 30582 1726855335.72360: checking for max_fail_percentage 30582 1726855335.72362: done checking for max_fail_percentage 30582 1726855335.72364: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.72364: done checking to see if all hosts have failed 30582 1726855335.72365: getting the remaining hosts for this loop 30582 1726855335.72366: done getting the remaining hosts for this loop 30582 1726855335.72370: getting the next task for host managed_node3 30582 1726855335.72380: done getting next task for host managed_node3 30582 1726855335.72383: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30582 1726855335.72390: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.72394: getting variables 30582 1726855335.72396: in VariableManager get_vars() 30582 1726855335.72436: Calling all_inventory to load vars for managed_node3 30582 1726855335.72439: Calling groups_inventory to load vars for managed_node3 30582 1726855335.72443: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.72457: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.72461: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.72464: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.74351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.75957: done with get_vars() 30582 1726855335.75983: done getting variables 30582 1726855335.76049: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855335.76163: variable 'profile' from source: play vars 30582 1726855335.76168: variable 'interface' from source: play vars 30582 1726855335.76229: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 14:02:15 -0400 (0:00:00.059) 0:01:12.112 ****** 30582 1726855335.76264: entering _queue_task() for managed_node3/command 30582 1726855335.76685: worker is 1 (out of 1 available) 30582 1726855335.76802: exiting _queue_task() for managed_node3/command 30582 1726855335.76814: done queuing things up, now waiting for results queue to drain 30582 1726855335.76815: waiting for pending results... 
30582 1726855335.77020: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30582 1726855335.77157: in run() - task 0affcc66-ac2b-aa83-7d57-00000000166d 30582 1726855335.77179: variable 'ansible_search_path' from source: unknown 30582 1726855335.77186: variable 'ansible_search_path' from source: unknown 30582 1726855335.77227: calling self._execute() 30582 1726855335.77326: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.77336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.77492: variable 'omit' from source: magic vars 30582 1726855335.77790: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.77840: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.77969: variable 'profile_stat' from source: set_fact 30582 1726855335.77989: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855335.77996: when evaluation is False, skipping this task 30582 1726855335.78002: _execute() done 30582 1726855335.78008: dumping result to json 30582 1726855335.78013: done dumping result, returning 30582 1726855335.78021: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000166d] 30582 1726855335.78029: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166d skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855335.78199: no more pending results, returning what we have 30582 1726855335.78203: results queue empty 30582 1726855335.78204: checking for any_errors_fatal 30582 1726855335.78214: done checking for any_errors_fatal 30582 1726855335.78215: checking for max_fail_percentage 30582 1726855335.78216: done checking for max_fail_percentage 30582 1726855335.78217: checking to see if all hosts have 
failed and the running result is not ok 30582 1726855335.78218: done checking to see if all hosts have failed 30582 1726855335.78218: getting the remaining hosts for this loop 30582 1726855335.78220: done getting the remaining hosts for this loop 30582 1726855335.78223: getting the next task for host managed_node3 30582 1726855335.78231: done getting next task for host managed_node3 30582 1726855335.78233: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855335.78238: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.78242: getting variables 30582 1726855335.78244: in VariableManager get_vars() 30582 1726855335.78283: Calling all_inventory to load vars for managed_node3 30582 1726855335.78286: Calling groups_inventory to load vars for managed_node3 30582 1726855335.78291: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.78305: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.78308: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.78311: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.79000: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166d 30582 1726855335.79004: WORKER PROCESS EXITING 30582 1726855335.79959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.81701: done with get_vars() 30582 1726855335.81727: done getting variables 30582 1726855335.81786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855335.81899: variable 'profile' from source: play vars 30582 1726855335.81903: variable 'interface' from source: play vars 30582 1726855335.81953: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:02:15 -0400 (0:00:00.057) 0:01:12.169 ****** 30582 1726855335.81983: entering _queue_task() for managed_node3/set_fact 30582 1726855335.82340: worker is 1 (out of 1 available) 30582 1726855335.82355: exiting _queue_task() for managed_node3/set_fact 30582 
1726855335.82369: done queuing things up, now waiting for results queue to drain 30582 1726855335.82371: waiting for pending results... 30582 1726855335.82735: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855335.82889: in run() - task 0affcc66-ac2b-aa83-7d57-00000000166e 30582 1726855335.83212: variable 'ansible_search_path' from source: unknown 30582 1726855335.83222: variable 'ansible_search_path' from source: unknown 30582 1726855335.83264: calling self._execute() 30582 1726855335.83506: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.83525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.83541: variable 'omit' from source: magic vars 30582 1726855335.84495: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.84499: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.84685: variable 'profile_stat' from source: set_fact 30582 1726855335.84762: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855335.84773: when evaluation is False, skipping this task 30582 1726855335.84800: _execute() done 30582 1726855335.84809: dumping result to json 30582 1726855335.84993: done dumping result, returning 30582 1726855335.84996: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000166e] 30582 1726855335.84999: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166e skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855335.85143: no more pending results, returning what we have 30582 1726855335.85148: results queue empty 30582 1726855335.85150: checking for any_errors_fatal 30582 1726855335.85158: done checking for any_errors_fatal 30582 1726855335.85159: checking for 
max_fail_percentage 30582 1726855335.85162: done checking for max_fail_percentage 30582 1726855335.85163: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.85163: done checking to see if all hosts have failed 30582 1726855335.85164: getting the remaining hosts for this loop 30582 1726855335.85166: done getting the remaining hosts for this loop 30582 1726855335.85170: getting the next task for host managed_node3 30582 1726855335.85181: done getting next task for host managed_node3 30582 1726855335.85185: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30582 1726855335.85191: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855335.85198: getting variables 30582 1726855335.85200: in VariableManager get_vars() 30582 1726855335.85244: Calling all_inventory to load vars for managed_node3 30582 1726855335.85247: Calling groups_inventory to load vars for managed_node3 30582 1726855335.85250: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.85265: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.85269: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.85273: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.86195: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000166e 30582 1726855335.86199: WORKER PROCESS EXITING 30582 1726855335.88814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855335.90976: done with get_vars() 30582 1726855335.91006: done getting variables 30582 1726855335.91074: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855335.91203: variable 'profile' from source: play vars 30582 1726855335.91206: variable 'interface' from source: play vars 30582 1726855335.91256: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 14:02:15 -0400 (0:00:00.093) 0:01:12.262 ****** 30582 1726855335.91285: entering _queue_task() for managed_node3/assert 30582 1726855335.91653: worker is 1 (out of 1 available) 30582 1726855335.91667: exiting _queue_task() for managed_node3/assert 30582 
1726855335.91679: done queuing things up, now waiting for results queue to drain 30582 1726855335.91681: waiting for pending results... 30582 1726855335.92143: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30582 1726855335.92489: in run() - task 0affcc66-ac2b-aa83-7d57-0000000015d5 30582 1726855335.92513: variable 'ansible_search_path' from source: unknown 30582 1726855335.92521: variable 'ansible_search_path' from source: unknown 30582 1726855335.92566: calling self._execute() 30582 1726855335.92742: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.92896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.92913: variable 'omit' from source: magic vars 30582 1726855335.94037: variable 'ansible_distribution_major_version' from source: facts 30582 1726855335.94040: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855335.94042: variable 'omit' from source: magic vars 30582 1726855335.94044: variable 'omit' from source: magic vars 30582 1726855335.94215: variable 'profile' from source: play vars 30582 1726855335.94225: variable 'interface' from source: play vars 30582 1726855335.94407: variable 'interface' from source: play vars 30582 1726855335.94433: variable 'omit' from source: magic vars 30582 1726855335.94598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855335.94640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855335.94666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855335.94689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855335.94711: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855335.94994: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855335.94997: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.94999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.95057: Set connection var ansible_timeout to 10 30582 1726855335.95065: Set connection var ansible_connection to ssh 30582 1726855335.95076: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855335.95085: Set connection var ansible_pipelining to False 30582 1726855335.95099: Set connection var ansible_shell_executable to /bin/sh 30582 1726855335.95110: Set connection var ansible_shell_type to sh 30582 1726855335.95323: variable 'ansible_shell_executable' from source: unknown 30582 1726855335.95326: variable 'ansible_connection' from source: unknown 30582 1726855335.95328: variable 'ansible_module_compression' from source: unknown 30582 1726855335.95330: variable 'ansible_shell_type' from source: unknown 30582 1726855335.95332: variable 'ansible_shell_executable' from source: unknown 30582 1726855335.95334: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855335.95336: variable 'ansible_pipelining' from source: unknown 30582 1726855335.95338: variable 'ansible_timeout' from source: unknown 30582 1726855335.95340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855335.95508: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855335.95554: variable 'omit' from source: magic vars 30582 1726855335.95565: starting 
attempt loop 30582 1726855335.95572: running the handler 30582 1726855335.95849: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855335.95861: Evaluated conditional (not lsr_net_profile_exists): True 30582 1726855335.95873: handler run complete 30582 1726855335.95893: attempt loop complete, returning result 30582 1726855335.96085: _execute() done 30582 1726855335.96090: dumping result to json 30582 1726855335.96092: done dumping result, returning 30582 1726855335.96094: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-0000000015d5] 30582 1726855335.96097: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d5 30582 1726855335.96163: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000015d5 30582 1726855335.96166: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855335.96238: no more pending results, returning what we have 30582 1726855335.96241: results queue empty 30582 1726855335.96242: checking for any_errors_fatal 30582 1726855335.96251: done checking for any_errors_fatal 30582 1726855335.96252: checking for max_fail_percentage 30582 1726855335.96254: done checking for max_fail_percentage 30582 1726855335.96256: checking to see if all hosts have failed and the running result is not ok 30582 1726855335.96257: done checking to see if all hosts have failed 30582 1726855335.96258: getting the remaining hosts for this loop 30582 1726855335.96260: done getting the remaining hosts for this loop 30582 1726855335.96263: getting the next task for host managed_node3 30582 1726855335.96273: done getting next task for host managed_node3 30582 1726855335.96277: ^ task is: TASK: Conditional asserts 30582 1726855335.96280: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855335.96288: getting variables 30582 1726855335.96290: in VariableManager get_vars() 30582 1726855335.96331: Calling all_inventory to load vars for managed_node3 30582 1726855335.96335: Calling groups_inventory to load vars for managed_node3 30582 1726855335.96339: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855335.96351: Calling all_plugins_play to load vars for managed_node3 30582 1726855335.96355: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855335.96358: Calling groups_plugins_play to load vars for managed_node3 30582 1726855335.99569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.03207: done with get_vars() 30582 1726855336.03308: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 14:02:16 -0400 (0:00:00.122) 0:01:12.385 ****** 30582 1726855336.03582: entering _queue_task() for managed_node3/include_tasks 30582 1726855336.04157: worker is 1 (out of 1 available) 30582 1726855336.04172: exiting _queue_task() for managed_node3/include_tasks 30582 1726855336.04185: done queuing things up, now waiting for results queue to drain 30582 1726855336.04490: waiting for pending results... 
30582 1726855336.04838: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855336.05010: in run() - task 0affcc66-ac2b-aa83-7d57-00000000100b 30582 1726855336.05044: variable 'ansible_search_path' from source: unknown 30582 1726855336.05048: variable 'ansible_search_path' from source: unknown 30582 1726855336.05666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855336.10396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855336.10514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855336.10548: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855336.10631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855336.10657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855336.10855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855336.11007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855336.11032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855336.11076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30582 1726855336.11091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855336.11595: dumping result to json 30582 1726855336.11599: done dumping result, returning 30582 1726855336.11703: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-00000000100b] 30582 1726855336.11706: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100b 30582 1726855336.11780: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100b 30582 1726855336.11783: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 30582 1726855336.11863: no more pending results, returning what we have 30582 1726855336.11867: results queue empty 30582 1726855336.11869: checking for any_errors_fatal 30582 1726855336.11876: done checking for any_errors_fatal 30582 1726855336.11877: checking for max_fail_percentage 30582 1726855336.11879: done checking for max_fail_percentage 30582 1726855336.11880: checking to see if all hosts have failed and the running result is not ok 30582 1726855336.11881: done checking to see if all hosts have failed 30582 1726855336.11882: getting the remaining hosts for this loop 30582 1726855336.11883: done getting the remaining hosts for this loop 30582 1726855336.11890: getting the next task for host managed_node3 30582 1726855336.11898: done getting next task for host managed_node3 30582 1726855336.11901: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855336.11904: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855336.11909: getting variables 30582 1726855336.11911: in VariableManager get_vars() 30582 1726855336.11956: Calling all_inventory to load vars for managed_node3 30582 1726855336.11959: Calling groups_inventory to load vars for managed_node3 30582 1726855336.11962: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855336.11974: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.11979: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.11982: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.15129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.18546: done with get_vars() 30582 1726855336.18582: done getting variables 30582 1726855336.18648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855336.18979: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 14:02:16 -0400 (0:00:00.154) 0:01:12.540 ****** 30582 1726855336.19015: entering _queue_task() for managed_node3/debug 30582 1726855336.19726: worker is 1 
(out of 1 available) 30582 1726855336.19742: exiting _queue_task() for managed_node3/debug 30582 1726855336.19754: done queuing things up, now waiting for results queue to drain 30582 1726855336.19756: waiting for pending results... 30582 1726855336.20061: running TaskExecutor() for managed_node3/TASK: Success in test 'I can remove an existing profile without taking it down' 30582 1726855336.20174: in run() - task 0affcc66-ac2b-aa83-7d57-00000000100c 30582 1726855336.20203: variable 'ansible_search_path' from source: unknown 30582 1726855336.20222: variable 'ansible_search_path' from source: unknown 30582 1726855336.20324: calling self._execute() 30582 1726855336.20381: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.20395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.20411: variable 'omit' from source: magic vars 30582 1726855336.20818: variable 'ansible_distribution_major_version' from source: facts 30582 1726855336.20835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855336.20848: variable 'omit' from source: magic vars 30582 1726855336.20901: variable 'omit' from source: magic vars 30582 1726855336.21007: variable 'lsr_description' from source: include params 30582 1726855336.21090: variable 'omit' from source: magic vars 30582 1726855336.21094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855336.21128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855336.21155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855336.21178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855336.21205: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855336.21241: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855336.21251: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.21259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.21380: Set connection var ansible_timeout to 10 30582 1726855336.21392: Set connection var ansible_connection to ssh 30582 1726855336.21414: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855336.21592: Set connection var ansible_pipelining to False 30582 1726855336.21595: Set connection var ansible_shell_executable to /bin/sh 30582 1726855336.21598: Set connection var ansible_shell_type to sh 30582 1726855336.21600: variable 'ansible_shell_executable' from source: unknown 30582 1726855336.21603: variable 'ansible_connection' from source: unknown 30582 1726855336.21605: variable 'ansible_module_compression' from source: unknown 30582 1726855336.21607: variable 'ansible_shell_type' from source: unknown 30582 1726855336.21609: variable 'ansible_shell_executable' from source: unknown 30582 1726855336.21611: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.21613: variable 'ansible_pipelining' from source: unknown 30582 1726855336.21614: variable 'ansible_timeout' from source: unknown 30582 1726855336.21616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.21653: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855336.21668: variable 'omit' from source: magic vars 30582 1726855336.21679: starting attempt 
loop 30582 1726855336.21685: running the handler 30582 1726855336.21744: handler run complete 30582 1726855336.21762: attempt loop complete, returning result 30582 1726855336.21769: _execute() done 30582 1726855336.21776: dumping result to json 30582 1726855336.21783: done dumping result, returning 30582 1726855336.21798: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can remove an existing profile without taking it down' [0affcc66-ac2b-aa83-7d57-00000000100c] 30582 1726855336.21809: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100c ok: [managed_node3] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 30582 1726855336.21999: no more pending results, returning what we have 30582 1726855336.22003: results queue empty 30582 1726855336.22005: checking for any_errors_fatal 30582 1726855336.22015: done checking for any_errors_fatal 30582 1726855336.22016: checking for max_fail_percentage 30582 1726855336.22018: done checking for max_fail_percentage 30582 1726855336.22019: checking to see if all hosts have failed and the running result is not ok 30582 1726855336.22020: done checking to see if all hosts have failed 30582 1726855336.22021: getting the remaining hosts for this loop 30582 1726855336.22022: done getting the remaining hosts for this loop 30582 1726855336.22027: getting the next task for host managed_node3 30582 1726855336.22036: done getting next task for host managed_node3 30582 1726855336.22040: ^ task is: TASK: Cleanup 30582 1726855336.22043: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855336.22050: getting variables 30582 1726855336.22166: in VariableManager get_vars() 30582 1726855336.22210: Calling all_inventory to load vars for managed_node3 30582 1726855336.22213: Calling groups_inventory to load vars for managed_node3 30582 1726855336.22217: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855336.22229: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.22233: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.22237: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.22785: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000100c 30582 1726855336.22791: WORKER PROCESS EXITING 30582 1726855336.24007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.27070: done with get_vars() 30582 1726855336.27169: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:02:16 -0400 (0:00:00.082) 0:01:12.623 ****** 30582 1726855336.27316: entering _queue_task() for managed_node3/include_tasks 30582 1726855336.27721: worker is 1 (out of 1 available) 30582 1726855336.27737: exiting _queue_task() for managed_node3/include_tasks 30582 1726855336.27751: done queuing things up, now waiting for results queue to drain 30582 1726855336.27752: waiting for pending results... 
30582 1726855336.27980: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855336.28095: in run() - task 0affcc66-ac2b-aa83-7d57-000000001010 30582 1726855336.28115: variable 'ansible_search_path' from source: unknown 30582 1726855336.28122: variable 'ansible_search_path' from source: unknown 30582 1726855336.28170: variable 'lsr_cleanup' from source: include params 30582 1726855336.28374: variable 'lsr_cleanup' from source: include params 30582 1726855336.28449: variable 'omit' from source: magic vars 30582 1726855336.28585: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.28601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.28615: variable 'omit' from source: magic vars 30582 1726855336.28865: variable 'ansible_distribution_major_version' from source: facts 30582 1726855336.28879: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855336.28892: variable 'item' from source: unknown 30582 1726855336.28958: variable 'item' from source: unknown 30582 1726855336.29000: variable 'item' from source: unknown 30582 1726855336.29070: variable 'item' from source: unknown 30582 1726855336.29392: dumping result to json 30582 1726855336.29397: done dumping result, returning 30582 1726855336.29400: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-000000001010] 30582 1726855336.29403: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001010 30582 1726855336.29446: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001010 30582 1726855336.29451: WORKER PROCESS EXITING 30582 1726855336.29481: no more pending results, returning what we have 30582 1726855336.29489: in VariableManager get_vars() 30582 1726855336.29539: Calling all_inventory to load vars for managed_node3 30582 1726855336.29543: Calling groups_inventory to load vars for managed_node3 30582 1726855336.29547: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855336.29566: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.29573: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.29576: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.31557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.33525: done with get_vars() 30582 1726855336.33553: variable 'ansible_search_path' from source: unknown 30582 1726855336.33554: variable 'ansible_search_path' from source: unknown 30582 1726855336.33598: we have included files to process 30582 1726855336.33599: generating all_blocks data 30582 1726855336.33602: done generating all_blocks data 30582 1726855336.33607: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855336.33609: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855336.33611: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855336.33916: done processing included file 30582 1726855336.33919: iterating over new_blocks loaded from include file 30582 1726855336.33920: in VariableManager get_vars() 30582 1726855336.33938: done with get_vars() 30582 1726855336.33940: filtering new block on tags 30582 1726855336.33968: done filtering new block on tags 30582 1726855336.33971: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855336.33976: extending task lists for all hosts with included blocks 
30582 1726855336.35902: done extending task lists 30582 1726855336.35904: done processing included files 30582 1726855336.35905: results queue empty 30582 1726855336.35905: checking for any_errors_fatal 30582 1726855336.35910: done checking for any_errors_fatal 30582 1726855336.35910: checking for max_fail_percentage 30582 1726855336.35912: done checking for max_fail_percentage 30582 1726855336.35912: checking to see if all hosts have failed and the running result is not ok 30582 1726855336.35913: done checking to see if all hosts have failed 30582 1726855336.35914: getting the remaining hosts for this loop 30582 1726855336.35915: done getting the remaining hosts for this loop 30582 1726855336.35917: getting the next task for host managed_node3 30582 1726855336.35921: done getting next task for host managed_node3 30582 1726855336.35923: ^ task is: TASK: Cleanup profile and device 30582 1726855336.35926: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855336.35929: getting variables 30582 1726855336.35930: in VariableManager get_vars() 30582 1726855336.35942: Calling all_inventory to load vars for managed_node3 30582 1726855336.35944: Calling groups_inventory to load vars for managed_node3 30582 1726855336.35946: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855336.35951: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.35953: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.35955: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.43134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.44634: done with get_vars() 30582 1726855336.44666: done getting variables 30582 1726855336.44718: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:02:16 -0400 (0:00:00.174) 0:01:12.797 ****** 30582 1726855336.44750: entering _queue_task() for managed_node3/shell 30582 1726855336.45297: worker is 1 (out of 1 available) 30582 1726855336.45314: exiting _queue_task() for managed_node3/shell 30582 1726855336.45328: done queuing things up, now waiting for results queue to drain 30582 1726855336.45331: waiting for pending results... 
30582 1726855336.45709: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855336.45828: in run() - task 0affcc66-ac2b-aa83-7d57-0000000016ad 30582 1726855336.45832: variable 'ansible_search_path' from source: unknown 30582 1726855336.45835: variable 'ansible_search_path' from source: unknown 30582 1726855336.45848: calling self._execute() 30582 1726855336.45951: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.45964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.45994: variable 'omit' from source: magic vars 30582 1726855336.46382: variable 'ansible_distribution_major_version' from source: facts 30582 1726855336.46493: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855336.46496: variable 'omit' from source: magic vars 30582 1726855336.46504: variable 'omit' from source: magic vars 30582 1726855336.46684: variable 'interface' from source: play vars 30582 1726855336.46712: variable 'omit' from source: magic vars 30582 1726855336.46760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855336.46802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855336.46828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855336.46850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855336.46873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855336.46909: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855336.46970: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.46973: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.47042: Set connection var ansible_timeout to 10 30582 1726855336.47049: Set connection var ansible_connection to ssh 30582 1726855336.47063: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855336.47077: Set connection var ansible_pipelining to False 30582 1726855336.47091: Set connection var ansible_shell_executable to /bin/sh 30582 1726855336.47099: Set connection var ansible_shell_type to sh 30582 1726855336.47124: variable 'ansible_shell_executable' from source: unknown 30582 1726855336.47132: variable 'ansible_connection' from source: unknown 30582 1726855336.47140: variable 'ansible_module_compression' from source: unknown 30582 1726855336.47186: variable 'ansible_shell_type' from source: unknown 30582 1726855336.47191: variable 'ansible_shell_executable' from source: unknown 30582 1726855336.47194: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.47196: variable 'ansible_pipelining' from source: unknown 30582 1726855336.47199: variable 'ansible_timeout' from source: unknown 30582 1726855336.47202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.47323: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855336.47342: variable 'omit' from source: magic vars 30582 1726855336.47353: starting attempt loop 30582 1726855336.47402: running the handler 30582 1726855336.47406: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855336.47410: _low_level_execute_command(): starting 30582 1726855336.47419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855336.48137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855336.48153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855336.48173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855336.48198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855336.48215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855336.48228: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855336.48289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855336.48338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855336.48359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855336.48384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.48479: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.50194: stdout chunk (state=3): >>>/root <<< 30582 1726855336.50353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855336.50356: stdout chunk (state=3): >>><<< 30582 1726855336.50359: stderr chunk (state=3): >>><<< 30582 1726855336.50389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855336.50486: _low_level_execute_command(): starting 30582 1726855336.50493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249 `" && echo ansible-tmp-1726855336.503966-33978-151764870002249="` echo 
/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249 `" ) && sleep 0' 30582 1726855336.51063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855336.51084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855336.51099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855336.51122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.51213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.53118: stdout chunk (state=3): >>>ansible-tmp-1726855336.503966-33978-151764870002249=/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249 <<< 30582 1726855336.53220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855336.53299: stderr chunk (state=3): >>><<< 30582 1726855336.53303: stdout chunk (state=3): >>><<< 30582 1726855336.53306: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855336.503966-33978-151764870002249=/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855336.53420: variable 'ansible_module_compression' from source: unknown 30582 1726855336.53423: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855336.53425: variable 'ansible_facts' from source: unknown 30582 1726855336.53506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py 30582 1726855336.53736: Sending initial data 30582 1726855336.53740: Sent initial data (155 bytes) 30582 1726855336.54256: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855336.54266: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855336.54281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855336.54383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855336.54398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.54490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.56024: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855336.56044: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855336.56064: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30582 1726855336.56092: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855336.56192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855336.56273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsqw6mzja /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py <<< 30582 1726855336.56286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py" <<< 30582 1726855336.56316: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30582 1726855336.56337: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpsqw6mzja" to remote "/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py" <<< 30582 1726855336.57280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855336.57284: stdout chunk (state=3): >>><<< 30582 1726855336.57291: stderr chunk (state=3): >>><<< 30582 1726855336.57377: done transferring module to remote 30582 1726855336.57382: _low_level_execute_command(): starting 30582 1726855336.57392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/ /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py && sleep 0' 30582 1726855336.58235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855336.58262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855336.58296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855336.58310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.58412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.60158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855336.60183: stderr chunk (state=3): >>><<< 30582 1726855336.60192: stdout chunk (state=3): >>><<< 30582 1726855336.60209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855336.60214: _low_level_execute_command(): starting 30582 1726855336.60221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/AnsiballZ_command.py && sleep 0' 30582 1726855336.60633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855336.60647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855336.60662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855336.60714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855336.60720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.60790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.81425: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (2e08db44-6b45-462b-a24b-1e1d0b41e5c0) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:02:16.758321", "end": "2024-09-20 14:02:16.812897", "delta": "0:00:00.054576", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855336.83655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855336.83659: stdout chunk (state=3): >>><<< 30582 1726855336.83895: stderr chunk (state=3): >>><<< 30582 1726855336.83900: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (2e08db44-6b45-462b-a24b-1e1d0b41e5c0) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:02:16.758321", "end": "2024-09-20 14:02:16.812897", "delta": "0:00:00.054576", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855336.83904: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855336.83911: _low_level_execute_command(): starting 30582 1726855336.83914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855336.503966-33978-151764870002249/ > /dev/null 2>&1 && sleep 0' 30582 1726855336.84440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855336.84461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855336.84474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855336.84488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
30582 1726855336.84508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855336.84566: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855336.84620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855336.84634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855336.84698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855336.84759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855336.86701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855336.86705: stdout chunk (state=3): >>><<< 30582 1726855336.86708: stderr chunk (state=3): >>><<< 30582 1726855336.86894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855336.86898: handler run complete 30582 1726855336.86901: Evaluated conditional (False): False 30582 1726855336.86903: attempt loop complete, returning result 30582 1726855336.86905: _execute() done 30582 1726855336.86907: dumping result to json 30582 1726855336.86909: done dumping result, returning 30582 1726855336.86911: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-0000000016ad] 30582 1726855336.86913: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000016ad 30582 1726855336.86989: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000016ad 30582 1726855336.86993: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.054576", "end": "2024-09-20 14:02:16.812897", "rc": 0, "start": "2024-09-20 14:02:16.758321" } STDOUT: Connection 'statebr' (2e08db44-6b45-462b-a24b-1e1d0b41e5c0) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 30582 1726855336.87083: no more pending results, returning what we have 30582 1726855336.87089: results queue empty 30582 1726855336.87090: checking for any_errors_fatal 30582 1726855336.87092: done checking for any_errors_fatal 30582 1726855336.87092: checking for max_fail_percentage 30582 1726855336.87095: done checking for max_fail_percentage 30582 1726855336.87096: checking to see if all hosts have failed and the running result is not ok 30582 1726855336.87096: done checking to see if all hosts have failed 30582 1726855336.87097: getting the remaining hosts for this loop 30582 1726855336.87099: done getting the remaining hosts for this loop 30582 1726855336.87103: getting the next task for host managed_node3 30582 1726855336.87114: done getting next task for host managed_node3 30582 1726855336.87119: ^ task is: TASK: Include the task 'run_test.yml' 30582 1726855336.87122: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855336.87127: getting variables 30582 1726855336.87129: in VariableManager get_vars() 30582 1726855336.87170: Calling all_inventory to load vars for managed_node3 30582 1726855336.87174: Calling groups_inventory to load vars for managed_node3 30582 1726855336.87177: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855336.87294: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.87307: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.87312: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.89214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855336.91041: done with get_vars() 30582 1726855336.91071: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 14:02:16 -0400 (0:00:00.464) 0:01:13.261 ****** 30582 1726855336.91190: entering _queue_task() for managed_node3/include_tasks 30582 1726855336.91697: worker is 1 (out of 1 available) 30582 1726855336.91710: exiting _queue_task() for managed_node3/include_tasks 30582 1726855336.91722: done queuing things up, now waiting for results queue to drain 30582 1726855336.91723: waiting for pending results... 
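Editorial note on the "Cleanup profile and device" result above: the module returned rc=0 even though stderr contained "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'". That is expected for a multi-line shell task: the exit status of the whole script is the exit status of its last command, so earlier failures are masked unless the commands are chained with `&&` or `set -e` is used. A minimal local demonstration (illustrative only, not taken from the playbook):

```shell
# Two commands, first fails, second succeeds: the overall rc is 0,
# mirroring how the failed 'nmcli con load' did not fail the task above.
sh -c 'false
true'
echo "rc=$?"
```

Running this prints `rc=0`, which is why cleanup tasks like the one above often deliberately tolerate per-command failures.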
30582 1726855336.92215: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30582 1726855336.92220: in run() - task 0affcc66-ac2b-aa83-7d57-000000000015 30582 1726855336.92223: variable 'ansible_search_path' from source: unknown 30582 1726855336.92226: calling self._execute() 30582 1726855336.92331: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855336.92344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855336.92361: variable 'omit' from source: magic vars 30582 1726855336.92839: variable 'ansible_distribution_major_version' from source: facts 30582 1726855336.92878: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855336.92900: _execute() done 30582 1726855336.92910: dumping result to json 30582 1726855336.92919: done dumping result, returning 30582 1726855336.92939: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-000000000015] 30582 1726855336.92960: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000015 30582 1726855336.93209: no more pending results, returning what we have 30582 1726855336.93216: in VariableManager get_vars() 30582 1726855336.93264: Calling all_inventory to load vars for managed_node3 30582 1726855336.93270: Calling groups_inventory to load vars for managed_node3 30582 1726855336.93274: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855336.93292: Calling all_plugins_play to load vars for managed_node3 30582 1726855336.93296: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855336.93300: Calling groups_plugins_play to load vars for managed_node3 30582 1726855336.93895: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000015 30582 1726855336.93899: WORKER PROCESS EXITING 30582 1726855336.96184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30582 1726855336.99036: done with get_vars() 30582 1726855336.99068: variable 'ansible_search_path' from source: unknown 30582 1726855336.99086: we have included files to process 30582 1726855336.99089: generating all_blocks data 30582 1726855336.99092: done generating all_blocks data 30582 1726855336.99101: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855336.99102: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855336.99105: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855336.99490: in VariableManager get_vars() 30582 1726855336.99510: done with get_vars() 30582 1726855336.99549: in VariableManager get_vars() 30582 1726855336.99567: done with get_vars() 30582 1726855336.99608: in VariableManager get_vars() 30582 1726855336.99624: done with get_vars() 30582 1726855336.99662: in VariableManager get_vars() 30582 1726855336.99678: done with get_vars() 30582 1726855336.99720: in VariableManager get_vars() 30582 1726855336.99737: done with get_vars() 30582 1726855337.00126: in VariableManager get_vars() 30582 1726855337.00142: done with get_vars() 30582 1726855337.00154: done processing included file 30582 1726855337.00155: iterating over new_blocks loaded from include file 30582 1726855337.00156: in VariableManager get_vars() 30582 1726855337.00167: done with get_vars() 30582 1726855337.00168: filtering new block on tags 30582 1726855337.00259: done filtering new block on tags 30582 1726855337.00262: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30582 1726855337.00267: extending task lists for all hosts with included 
blocks 30582 1726855337.00302: done extending task lists 30582 1726855337.00303: done processing included files 30582 1726855337.00304: results queue empty 30582 1726855337.00305: checking for any_errors_fatal 30582 1726855337.00309: done checking for any_errors_fatal 30582 1726855337.00309: checking for max_fail_percentage 30582 1726855337.00311: done checking for max_fail_percentage 30582 1726855337.00311: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.00312: done checking to see if all hosts have failed 30582 1726855337.00313: getting the remaining hosts for this loop 30582 1726855337.00314: done getting the remaining hosts for this loop 30582 1726855337.00316: getting the next task for host managed_node3 30582 1726855337.00320: done getting next task for host managed_node3 30582 1726855337.00322: ^ task is: TASK: TEST: {{ lsr_description }} 30582 1726855337.00324: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.00326: getting variables 30582 1726855337.00327: in VariableManager get_vars() 30582 1726855337.00336: Calling all_inventory to load vars for managed_node3 30582 1726855337.00338: Calling groups_inventory to load vars for managed_node3 30582 1726855337.00340: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.00345: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.00347: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.00350: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.01560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.03170: done with get_vars() 30582 1726855337.03204: done getting variables 30582 1726855337.03251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855337.03369: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 14:02:17 -0400 (0:00:00.122) 0:01:13.383 ****** 30582 1726855337.03401: entering _queue_task() for managed_node3/debug 30582 1726855337.03778: worker is 1 (out of 1 available) 30582 1726855337.03794: exiting _queue_task() for managed_node3/debug 30582 1726855337.03806: done queuing things up, now waiting for results queue to drain 30582 1726855337.03808: waiting for pending results... 
30582 1726855337.04111: running TaskExecutor() for managed_node3/TASK: TEST: I can take a profile down that is absent 30582 1726855337.04314: in run() - task 0affcc66-ac2b-aa83-7d57-000000001744 30582 1726855337.04319: variable 'ansible_search_path' from source: unknown 30582 1726855337.04321: variable 'ansible_search_path' from source: unknown 30582 1726855337.04325: calling self._execute() 30582 1726855337.04384: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.04397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.04419: variable 'omit' from source: magic vars 30582 1726855337.04808: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.04826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.04838: variable 'omit' from source: magic vars 30582 1726855337.04886: variable 'omit' from source: magic vars 30582 1726855337.04984: variable 'lsr_description' from source: include params 30582 1726855337.05009: variable 'omit' from source: magic vars 30582 1726855337.05049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855337.05098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.05123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855337.05141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.05155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.05295: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.05298: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855337.05300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.05401: Set connection var ansible_timeout to 10 30582 1726855337.05405: Set connection var ansible_connection to ssh 30582 1726855337.05407: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.05493: Set connection var ansible_pipelining to False 30582 1726855337.05497: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.05501: Set connection var ansible_shell_type to sh 30582 1726855337.05503: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.05507: variable 'ansible_connection' from source: unknown 30582 1726855337.05510: variable 'ansible_module_compression' from source: unknown 30582 1726855337.05512: variable 'ansible_shell_type' from source: unknown 30582 1726855337.05514: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.05516: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.05518: variable 'ansible_pipelining' from source: unknown 30582 1726855337.05519: variable 'ansible_timeout' from source: unknown 30582 1726855337.05521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.05652: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.05657: variable 'omit' from source: magic vars 30582 1726855337.05660: starting attempt loop 30582 1726855337.05663: running the handler 30582 1726855337.05759: handler run complete 30582 1726855337.05763: attempt loop complete, returning result 30582 1726855337.05765: _execute() done 30582 1726855337.05770: dumping result to json 30582 1726855337.05772: done dumping result, returning 
30582 1726855337.05793: done running TaskExecutor() for managed_node3/TASK: TEST: I can take a profile down that is absent [0affcc66-ac2b-aa83-7d57-000000001744] 30582 1726855337.05796: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001744 ok: [managed_node3] => {} MSG: ########## I can take a profile down that is absent ########## 30582 1726855337.05919: no more pending results, returning what we have 30582 1726855337.05923: results queue empty 30582 1726855337.05925: checking for any_errors_fatal 30582 1726855337.05927: done checking for any_errors_fatal 30582 1726855337.05928: checking for max_fail_percentage 30582 1726855337.05930: done checking for max_fail_percentage 30582 1726855337.05931: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.05932: done checking to see if all hosts have failed 30582 1726855337.05933: getting the remaining hosts for this loop 30582 1726855337.05935: done getting the remaining hosts for this loop 30582 1726855337.05939: getting the next task for host managed_node3 30582 1726855337.05949: done getting next task for host managed_node3 30582 1726855337.05952: ^ task is: TASK: Show item 30582 1726855337.05955: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.05960: getting variables 30582 1726855337.05962: in VariableManager get_vars() 30582 1726855337.06007: Calling all_inventory to load vars for managed_node3 30582 1726855337.06011: Calling groups_inventory to load vars for managed_node3 30582 1726855337.06015: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.06028: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.06032: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.06037: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.06606: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001744 30582 1726855337.06627: WORKER PROCESS EXITING 30582 1726855337.07872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.09569: done with get_vars() 30582 1726855337.09603: done getting variables 30582 1726855337.09672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 14:02:17 -0400 (0:00:00.063) 0:01:13.446 ****** 30582 1726855337.09705: entering _queue_task() for managed_node3/debug 30582 1726855337.10206: worker is 1 (out of 1 available) 30582 1726855337.10218: exiting _queue_task() for managed_node3/debug 30582 1726855337.10227: done queuing things up, now waiting for results queue to drain 30582 1726855337.10229: waiting for pending results... 
30582 1726855337.10476: running TaskExecutor() for managed_node3/TASK: Show item 30582 1726855337.10580: in run() - task 0affcc66-ac2b-aa83-7d57-000000001745 30582 1726855337.10606: variable 'ansible_search_path' from source: unknown 30582 1726855337.10616: variable 'ansible_search_path' from source: unknown 30582 1726855337.10698: variable 'omit' from source: magic vars 30582 1726855337.10904: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.10918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.10934: variable 'omit' from source: magic vars 30582 1726855337.11441: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.11445: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.11448: variable 'omit' from source: magic vars 30582 1726855337.11484: variable 'omit' from source: magic vars 30582 1726855337.11537: variable 'item' from source: unknown 30582 1726855337.11627: variable 'item' from source: unknown 30582 1726855337.11679: variable 'omit' from source: magic vars 30582 1726855337.11766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855337.11882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.11916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855337.12008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.12011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.12014: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.12017: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855337.12019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.12121: Set connection var ansible_timeout to 10 30582 1726855337.12132: Set connection var ansible_connection to ssh 30582 1726855337.12241: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.12244: Set connection var ansible_pipelining to False 30582 1726855337.12246: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.12247: Set connection var ansible_shell_type to sh 30582 1726855337.12249: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.12251: variable 'ansible_connection' from source: unknown 30582 1726855337.12253: variable 'ansible_module_compression' from source: unknown 30582 1726855337.12254: variable 'ansible_shell_type' from source: unknown 30582 1726855337.12256: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.12258: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.12259: variable 'ansible_pipelining' from source: unknown 30582 1726855337.12261: variable 'ansible_timeout' from source: unknown 30582 1726855337.12263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.12352: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.12376: variable 'omit' from source: magic vars 30582 1726855337.12391: starting attempt loop 30582 1726855337.12398: running the handler 30582 1726855337.12447: variable 'lsr_description' from source: include params 30582 1726855337.12596: variable 'lsr_description' from source: include params 30582 1726855337.12600: handler run complete 30582 1726855337.12602: attempt loop 
complete, returning result 30582 1726855337.12604: variable 'item' from source: unknown 30582 1726855337.12653: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 30582 1726855337.13197: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.13200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.13202: variable 'omit' from source: magic vars 30582 1726855337.13205: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.13207: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.13209: variable 'omit' from source: magic vars 30582 1726855337.13211: variable 'omit' from source: magic vars 30582 1726855337.13213: variable 'item' from source: unknown 30582 1726855337.13234: variable 'item' from source: unknown 30582 1726855337.13252: variable 'omit' from source: magic vars 30582 1726855337.13278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.13293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.13311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.13326: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.13333: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.13339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.13427: Set connection var ansible_timeout to 10 30582 1726855337.13434: Set connection var ansible_connection to ssh 30582 
1726855337.13445: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.13453: Set connection var ansible_pipelining to False 30582 1726855337.13463: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.13471: Set connection var ansible_shell_type to sh 30582 1726855337.13495: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.13502: variable 'ansible_connection' from source: unknown 30582 1726855337.13509: variable 'ansible_module_compression' from source: unknown 30582 1726855337.13521: variable 'ansible_shell_type' from source: unknown 30582 1726855337.13528: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.13535: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.13543: variable 'ansible_pipelining' from source: unknown 30582 1726855337.13549: variable 'ansible_timeout' from source: unknown 30582 1726855337.13556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.13739: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.13743: variable 'omit' from source: magic vars 30582 1726855337.13745: starting attempt loop 30582 1726855337.13747: running the handler 30582 1726855337.13749: variable 'lsr_setup' from source: include params 30582 1726855337.13781: variable 'lsr_setup' from source: include params 30582 1726855337.13832: handler run complete 30582 1726855337.13861: attempt loop complete, returning result 30582 1726855337.13881: variable 'item' from source: unknown 30582 1726855337.13943: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 30582 1726855337.14178: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.14182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.14184: variable 'omit' from source: magic vars 30582 1726855337.14396: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.14400: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.14402: variable 'omit' from source: magic vars 30582 1726855337.14404: variable 'omit' from source: magic vars 30582 1726855337.14407: variable 'item' from source: unknown 30582 1726855337.14453: variable 'item' from source: unknown 30582 1726855337.14473: variable 'omit' from source: magic vars 30582 1726855337.14501: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.14515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.14524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.14538: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.14544: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.14551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.14635: Set connection var ansible_timeout to 10 30582 1726855337.14642: Set connection var ansible_connection to ssh 30582 1726855337.14653: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.14664: Set connection var ansible_pipelining to False 30582 1726855337.14677: Set connection var ansible_shell_executable to 
/bin/sh 30582 1726855337.14683: Set connection var ansible_shell_type to sh 30582 1726855337.14705: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.14712: variable 'ansible_connection' from source: unknown 30582 1726855337.14830: variable 'ansible_module_compression' from source: unknown 30582 1726855337.14832: variable 'ansible_shell_type' from source: unknown 30582 1726855337.14835: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.14837: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.14838: variable 'ansible_pipelining' from source: unknown 30582 1726855337.14840: variable 'ansible_timeout' from source: unknown 30582 1726855337.14842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.14844: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.14853: variable 'omit' from source: magic vars 30582 1726855337.14861: starting attempt loop 30582 1726855337.14867: running the handler 30582 1726855337.14893: variable 'lsr_test' from source: include params 30582 1726855337.14964: variable 'lsr_test' from source: include params 30582 1726855337.14989: handler run complete 30582 1726855337.15046: attempt loop complete, returning result 30582 1726855337.15049: variable 'item' from source: unknown 30582 1726855337.15097: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30582 1726855337.15269: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.15279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 
30582 1726855337.15294: variable 'omit' from source: magic vars 30582 1726855337.15486: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.15491: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.15493: variable 'omit' from source: magic vars 30582 1726855337.15495: variable 'omit' from source: magic vars 30582 1726855337.15505: variable 'item' from source: unknown 30582 1726855337.15559: variable 'item' from source: unknown 30582 1726855337.15582: variable 'omit' from source: magic vars 30582 1726855337.15615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.15628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.15639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.15653: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.15660: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.15704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.15758: Set connection var ansible_timeout to 10 30582 1726855337.15769: Set connection var ansible_connection to ssh 30582 1726855337.15783: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.15795: Set connection var ansible_pipelining to False 30582 1726855337.15892: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.15896: Set connection var ansible_shell_type to sh 30582 1726855337.15898: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.15901: variable 'ansible_connection' from source: unknown 30582 1726855337.15902: variable 'ansible_module_compression' 
from source: unknown 30582 1726855337.15904: variable 'ansible_shell_type' from source: unknown 30582 1726855337.15906: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.15908: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.15910: variable 'ansible_pipelining' from source: unknown 30582 1726855337.15912: variable 'ansible_timeout' from source: unknown 30582 1726855337.15916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.16033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.16036: variable 'omit' from source: magic vars 30582 1726855337.16038: starting attempt loop 30582 1726855337.16041: running the handler 30582 1726855337.16062: variable 'lsr_assert' from source: include params 30582 1726855337.16252: variable 'lsr_assert' from source: include params 30582 1726855337.16256: handler run complete 30582 1726855337.16259: attempt loop complete, returning result 30582 1726855337.16261: variable 'item' from source: unknown 30582 1726855337.16300: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 30582 1726855337.16474: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.16484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.16500: variable 'omit' from source: magic vars 30582 1726855337.17123: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.17136: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.17151: variable 
'omit' from source: magic vars 30582 1726855337.17233: variable 'omit' from source: magic vars 30582 1726855337.17236: variable 'item' from source: unknown 30582 1726855337.17290: variable 'item' from source: unknown 30582 1726855337.17310: variable 'omit' from source: magic vars 30582 1726855337.17342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.17357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.17370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.17386: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.17397: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.17405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.17496: Set connection var ansible_timeout to 10 30582 1726855337.17559: Set connection var ansible_connection to ssh 30582 1726855337.17562: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.17564: Set connection var ansible_pipelining to False 30582 1726855337.17566: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.17571: Set connection var ansible_shell_type to sh 30582 1726855337.17573: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.17575: variable 'ansible_connection' from source: unknown 30582 1726855337.17577: variable 'ansible_module_compression' from source: unknown 30582 1726855337.17578: variable 'ansible_shell_type' from source: unknown 30582 1726855337.17580: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.17582: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855337.17590: variable 'ansible_pipelining' from source: unknown 30582 1726855337.17599: variable 'ansible_timeout' from source: unknown 30582 1726855337.17607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.17711: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.17779: variable 'omit' from source: magic vars 30582 1726855337.17782: starting attempt loop 30582 1726855337.17785: running the handler 30582 1726855337.17789: variable 'lsr_assert_when' from source: include params 30582 1726855337.17832: variable 'lsr_assert_when' from source: include params 30582 1726855337.17931: variable 'network_provider' from source: set_fact 30582 1726855337.17966: handler run complete 30582 1726855337.17986: attempt loop complete, returning result 30582 1726855337.18012: variable 'item' from source: unknown 30582 1726855337.18070: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30582 1726855337.18263: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.18266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.18270: variable 'omit' from source: magic vars 30582 1726855337.18395: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.18404: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.18483: variable 'omit' from source: magic vars 30582 1726855337.18488: variable 'omit' from source: magic vars 30582 1726855337.18491: variable 'item' from 
source: unknown 30582 1726855337.18529: variable 'item' from source: unknown 30582 1726855337.18545: variable 'omit' from source: magic vars 30582 1726855337.18563: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.18575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.18589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.18607: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.18613: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.18620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.18708: Set connection var ansible_timeout to 10 30582 1726855337.18716: Set connection var ansible_connection to ssh 30582 1726855337.18730: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.18739: Set connection var ansible_pipelining to False 30582 1726855337.18748: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.18754: Set connection var ansible_shell_type to sh 30582 1726855337.18802: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.18805: variable 'ansible_connection' from source: unknown 30582 1726855337.18813: variable 'ansible_module_compression' from source: unknown 30582 1726855337.18815: variable 'ansible_shell_type' from source: unknown 30582 1726855337.18817: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.18819: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.18820: variable 'ansible_pipelining' from source: unknown 30582 1726855337.18822: variable 'ansible_timeout' from source: unknown 30582 
1726855337.18912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.18939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.18952: variable 'omit' from source: magic vars 30582 1726855337.18962: starting attempt loop 30582 1726855337.18971: running the handler 30582 1726855337.19007: variable 'lsr_fail_debug' from source: play vars 30582 1726855337.19086: variable 'lsr_fail_debug' from source: play vars 30582 1726855337.19117: handler run complete 30582 1726855337.19151: attempt loop complete, returning result 30582 1726855337.19178: variable 'item' from source: unknown 30582 1726855337.19251: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855337.19640: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.19644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.19646: variable 'omit' from source: magic vars 30582 1726855337.19648: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.19650: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.19652: variable 'omit' from source: magic vars 30582 1726855337.19654: variable 'omit' from source: magic vars 30582 1726855337.19664: variable 'item' from source: unknown 30582 1726855337.19738: variable 'item' from source: unknown 30582 1726855337.19751: variable 'omit' from source: magic vars 30582 1726855337.19768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30582 1726855337.19857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.19860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.19863: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.19865: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.19867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.19900: Set connection var ansible_timeout to 10 30582 1726855337.19903: Set connection var ansible_connection to ssh 30582 1726855337.19910: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.19915: Set connection var ansible_pipelining to False 30582 1726855337.19920: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.19923: Set connection var ansible_shell_type to sh 30582 1726855337.19948: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.19961: variable 'ansible_connection' from source: unknown 30582 1726855337.19965: variable 'ansible_module_compression' from source: unknown 30582 1726855337.19967: variable 'ansible_shell_type' from source: unknown 30582 1726855337.19969: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.19971: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.19974: variable 'ansible_pipelining' from source: unknown 30582 1726855337.19976: variable 'ansible_timeout' from source: unknown 30582 1726855337.19981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.20182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.20186: variable 'omit' from source: magic vars 30582 1726855337.20190: starting attempt loop 30582 1726855337.20192: running the handler 30582 1726855337.20194: variable 'lsr_cleanup' from source: include params 30582 1726855337.20196: variable 'lsr_cleanup' from source: include params 30582 1726855337.20200: handler run complete 30582 1726855337.20214: attempt loop complete, returning result 30582 1726855337.20293: variable 'item' from source: unknown 30582 1726855337.20301: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30582 1726855337.20391: dumping result to json 30582 1726855337.20399: done dumping result, returning 30582 1726855337.20403: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-000000001745] 30582 1726855337.20406: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001745 30582 1726855337.20449: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001745 30582 1726855337.20512: no more pending results, returning what we have 30582 1726855337.20516: results queue empty 30582 1726855337.20517: checking for any_errors_fatal 30582 1726855337.20523: done checking for any_errors_fatal 30582 1726855337.20524: checking for max_fail_percentage 30582 1726855337.20526: done checking for max_fail_percentage 30582 1726855337.20527: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.20528: done checking to see if all hosts have failed 30582 1726855337.20529: getting the remaining hosts for this loop 30582 1726855337.20530: done getting the remaining hosts for this loop 30582 1726855337.20534: getting the next task for host 
managed_node3 30582 1726855337.20542: done getting next task for host managed_node3 30582 1726855337.20544: ^ task is: TASK: Include the task 'show_interfaces.yml' 30582 1726855337.20547: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855337.20552: getting variables 30582 1726855337.20554: in VariableManager get_vars() 30582 1726855337.20598: Calling all_inventory to load vars for managed_node3 30582 1726855337.20601: Calling groups_inventory to load vars for managed_node3 30582 1726855337.20605: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.20619: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.20623: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.20627: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.21493: WORKER PROCESS EXITING 30582 1726855337.22546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.24190: done with get_vars() 30582 1726855337.24222: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 14:02:17 -0400 (0:00:00.146) 0:01:13.593 ****** 30582 1726855337.24328: entering _queue_task() for managed_node3/include_tasks 30582 
1726855337.24892: worker is 1 (out of 1 available) 30582 1726855337.24901: exiting _queue_task() for managed_node3/include_tasks 30582 1726855337.24912: done queuing things up, now waiting for results queue to drain 30582 1726855337.24914: waiting for pending results... 30582 1726855337.25046: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855337.25253: in run() - task 0affcc66-ac2b-aa83-7d57-000000001746 30582 1726855337.25257: variable 'ansible_search_path' from source: unknown 30582 1726855337.25261: variable 'ansible_search_path' from source: unknown 30582 1726855337.25264: calling self._execute() 30582 1726855337.25343: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.25361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.25380: variable 'omit' from source: magic vars 30582 1726855337.25794: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.25816: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.25827: _execute() done 30582 1726855337.25834: dumping result to json 30582 1726855337.25903: done dumping result, returning 30582 1726855337.25908: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-000000001746] 30582 1726855337.25910: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001746 30582 1726855337.25979: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001746 30582 1726855337.25981: WORKER PROCESS EXITING 30582 1726855337.26012: no more pending results, returning what we have 30582 1726855337.26017: in VariableManager get_vars() 30582 1726855337.26064: Calling all_inventory to load vars for managed_node3 30582 1726855337.26067: Calling groups_inventory to load vars for managed_node3 30582 1726855337.26070: Calling all_plugins_inventory to load vars for managed_node3 
30582 1726855337.26084: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.26090: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.26093: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.27915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.29222: done with get_vars() 30582 1726855337.29241: variable 'ansible_search_path' from source: unknown 30582 1726855337.29242: variable 'ansible_search_path' from source: unknown 30582 1726855337.29271: we have included files to process 30582 1726855337.29272: generating all_blocks data 30582 1726855337.29273: done generating all_blocks data 30582 1726855337.29276: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855337.29277: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855337.29278: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855337.29353: in VariableManager get_vars() 30582 1726855337.29368: done with get_vars() 30582 1726855337.29448: done processing included file 30582 1726855337.29450: iterating over new_blocks loaded from include file 30582 1726855337.29451: in VariableManager get_vars() 30582 1726855337.29463: done with get_vars() 30582 1726855337.29464: filtering new block on tags 30582 1726855337.29486: done filtering new block on tags 30582 1726855337.29490: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 30582 1726855337.29494: extending task lists for all hosts with included blocks 30582 1726855337.29756: 
done extending task lists 30582 1726855337.29757: done processing included files 30582 1726855337.29758: results queue empty 30582 1726855337.29758: checking for any_errors_fatal 30582 1726855337.29763: done checking for any_errors_fatal 30582 1726855337.29764: checking for max_fail_percentage 30582 1726855337.29765: done checking for max_fail_percentage 30582 1726855337.29765: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.29766: done checking to see if all hosts have failed 30582 1726855337.29766: getting the remaining hosts for this loop 30582 1726855337.29767: done getting the remaining hosts for this loop 30582 1726855337.29769: getting the next task for host managed_node3 30582 1726855337.29773: done getting next task for host managed_node3 30582 1726855337.29775: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855337.29777: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.29779: getting variables 30582 1726855337.29779: in VariableManager get_vars() 30582 1726855337.29788: Calling all_inventory to load vars for managed_node3 30582 1726855337.29790: Calling groups_inventory to load vars for managed_node3 30582 1726855337.29792: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.29796: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.29797: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.29799: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.30467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.31818: done with get_vars() 30582 1726855337.31835: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 14:02:17 -0400 (0:00:00.075) 0:01:13.668 ****** 30582 1726855337.31896: entering _queue_task() for managed_node3/include_tasks 30582 1726855337.32162: worker is 1 (out of 1 available) 30582 1726855337.32179: exiting _queue_task() for managed_node3/include_tasks 30582 1726855337.32192: done queuing things up, now waiting for results queue to drain 30582 1726855337.32194: waiting for pending results... 
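The task banners in this log (e.g. `Friday 20 September 2024 14:02:17 -0400 (0:00:00.075) 0:01:13.668`) carry two elapsed-time fields: the parenthesized one is the previous task's duration, the second is cumulative run time. A minimal sketch for extracting those values from such a line; the regex and helper names below are ours, not part of Ansible:

```python
import re

# Matches the two H:MM:SS.mmm fields in an ansible-playbook timing banner,
# e.g. "... -0400 (0:00:00.146) 0:01:13.593 ******". The first (parenthesized)
# field is the prior task's duration, the second is cumulative elapsed time.
TIMING = re.compile(r"\((\d+:\d{2}:\d{2}\.\d+)\)\s+(\d+:\d{2}:\d{2}\.\d+)")

def to_seconds(hms: str) -> float:
    """Convert an H:MM:SS.mmm string to seconds."""
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

line = "Friday 20 September 2024 14:02:17 -0400 (0:00:00.146) 0:01:13.593 ******"
match = TIMING.search(line)
duration, elapsed = (to_seconds(g) for g in match.groups())
print(duration, elapsed)
```

Summing the per-task durations across a full log like this one is a quick way to find which tasks dominate the run time.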
30582 1726855337.32377: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855337.32455: in run() - task 0affcc66-ac2b-aa83-7d57-00000000176d 30582 1726855337.32466: variable 'ansible_search_path' from source: unknown 30582 1726855337.32473: variable 'ansible_search_path' from source: unknown 30582 1726855337.32500: calling self._execute() 30582 1726855337.32571: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.32575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.32581: variable 'omit' from source: magic vars 30582 1726855337.32858: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.32871: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.32874: _execute() done 30582 1726855337.32877: dumping result to json 30582 1726855337.32880: done dumping result, returning 30582 1726855337.32890: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-00000000176d] 30582 1726855337.32895: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000176d 30582 1726855337.32981: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000176d 30582 1726855337.32983: WORKER PROCESS EXITING 30582 1726855337.33011: no more pending results, returning what we have 30582 1726855337.33016: in VariableManager get_vars() 30582 1726855337.33059: Calling all_inventory to load vars for managed_node3 30582 1726855337.33062: Calling groups_inventory to load vars for managed_node3 30582 1726855337.33065: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.33081: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.33085: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.33095: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855337.34316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.35505: done with get_vars() 30582 1726855337.35522: variable 'ansible_search_path' from source: unknown 30582 1726855337.35523: variable 'ansible_search_path' from source: unknown 30582 1726855337.35551: we have included files to process 30582 1726855337.35552: generating all_blocks data 30582 1726855337.35553: done generating all_blocks data 30582 1726855337.35554: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855337.35554: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855337.35556: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855337.35741: done processing included file 30582 1726855337.35742: iterating over new_blocks loaded from include file 30582 1726855337.35743: in VariableManager get_vars() 30582 1726855337.35756: done with get_vars() 30582 1726855337.35758: filtering new block on tags 30582 1726855337.35784: done filtering new block on tags 30582 1726855337.35786: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855337.35791: extending task lists for all hosts with included blocks 30582 1726855337.35886: done extending task lists 30582 1726855337.35888: done processing included files 30582 1726855337.35889: results queue empty 30582 1726855337.35890: checking for any_errors_fatal 30582 1726855337.35892: done checking for any_errors_fatal 30582 1726855337.35892: checking for max_fail_percentage 30582 1726855337.35893: done 
checking for max_fail_percentage 30582 1726855337.35894: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.35894: done checking to see if all hosts have failed 30582 1726855337.35895: getting the remaining hosts for this loop 30582 1726855337.35896: done getting the remaining hosts for this loop 30582 1726855337.35897: getting the next task for host managed_node3 30582 1726855337.35901: done getting next task for host managed_node3 30582 1726855337.35902: ^ task is: TASK: Gather current interface info 30582 1726855337.35904: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.35906: getting variables 30582 1726855337.35906: in VariableManager get_vars() 30582 1726855337.35914: Calling all_inventory to load vars for managed_node3 30582 1726855337.35915: Calling groups_inventory to load vars for managed_node3 30582 1726855337.35917: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.35921: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.35922: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.35924: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.36594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.37935: done with get_vars() 30582 1726855337.37958: done getting variables 30582 1726855337.38005: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:02:17 -0400 (0:00:00.061) 0:01:13.730 ****** 30582 1726855337.38038: entering _queue_task() for managed_node3/command 30582 1726855337.38426: worker is 1 (out of 1 available) 30582 1726855337.38440: exiting _queue_task() for managed_node3/command 30582 1726855337.38452: done queuing things up, now waiting for results queue to drain 30582 1726855337.38454: waiting for pending results... 
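The entries that follow show `_low_level_execute_command()` running shell one-liners over the SSH connection, including the remote temp-directory setup (`( umask 77 && mkdir -p ... ) && sleep 0`). A rough reconstruction of how that command string is assembled; the naming scheme (timestamp, PID, random suffix) mirrors what appears in the log, but this is an illustrative sketch, not Ansible's actual implementation:

```python
import os
import random
import time

def make_tmpdir_command(basedir: str = "/root/.ansible/tmp") -> str:
    # Directory names in the log look like
    # "ansible-tmp-<timestamp>-<pid>-<random>"; we imitate that here.
    name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
    path = f"{basedir}/{name}"
    # umask 77 keeps the tmpdir private to the remote user; the trailing
    # "echo name=path" lets the controller learn the created path from stdout,
    # and "sleep 0" forces the shell to flush before the channel closes.
    return (
        f'( umask 77 && mkdir -p "` echo {basedir} `"'
        f'&& mkdir "` echo {path} `" '
        f'&& echo {name}="` echo {path} `" ) && sleep 0'
    )

cmd = make_tmpdir_command()
print(cmd)
```

The controller parses the `name=path` line out of stdout, which is why the log shows the same `ansible-tmp-...` token on both sides of the `=`.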
30582 1726855337.38760: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855337.38870: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017a8 30582 1726855337.38874: variable 'ansible_search_path' from source: unknown 30582 1726855337.38877: variable 'ansible_search_path' from source: unknown 30582 1726855337.38952: calling self._execute() 30582 1726855337.39002: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.39006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.39016: variable 'omit' from source: magic vars 30582 1726855337.39301: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.39310: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.39316: variable 'omit' from source: magic vars 30582 1726855337.39350: variable 'omit' from source: magic vars 30582 1726855337.39378: variable 'omit' from source: magic vars 30582 1726855337.39412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855337.39438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.39457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855337.39472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.39483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.39510: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.39514: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.39516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855337.39591: Set connection var ansible_timeout to 10 30582 1726855337.39594: Set connection var ansible_connection to ssh 30582 1726855337.39604: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.39607: Set connection var ansible_pipelining to False 30582 1726855337.39610: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.39612: Set connection var ansible_shell_type to sh 30582 1726855337.39628: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.39631: variable 'ansible_connection' from source: unknown 30582 1726855337.39634: variable 'ansible_module_compression' from source: unknown 30582 1726855337.39636: variable 'ansible_shell_type' from source: unknown 30582 1726855337.39639: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.39641: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.39643: variable 'ansible_pipelining' from source: unknown 30582 1726855337.39646: variable 'ansible_timeout' from source: unknown 30582 1726855337.39650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.39756: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.39764: variable 'omit' from source: magic vars 30582 1726855337.39772: starting attempt loop 30582 1726855337.39775: running the handler 30582 1726855337.39790: _low_level_execute_command(): starting 30582 1726855337.39797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855337.40301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855337.40305: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.40317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.40331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.40389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.40392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.40463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.42166: stdout chunk (state=3): >>>/root <<< 30582 1726855337.42264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855337.42292: stderr chunk (state=3): >>><<< 30582 1726855337.42295: stdout chunk (state=3): >>><<< 30582 1726855337.42316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855337.42328: _low_level_execute_command(): starting 30582 1726855337.42334: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042 `" && echo ansible-tmp-1726855337.4231572-34015-94543618091042="` echo /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042 `" ) && sleep 0' 30582 1726855337.42754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.42757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.42760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 
30582 1726855337.42769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855337.42771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.42817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.42820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.42889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.44778: stdout chunk (state=3): >>>ansible-tmp-1726855337.4231572-34015-94543618091042=/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042 <<< 30582 1726855337.44883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855337.44912: stderr chunk (state=3): >>><<< 30582 1726855337.44915: stdout chunk (state=3): >>><<< 30582 1726855337.44932: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855337.4231572-34015-94543618091042=/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855337.44962: variable 'ansible_module_compression' from source: unknown 30582 1726855337.45009: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855337.45040: variable 'ansible_facts' from source: unknown 30582 1726855337.45102: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py 30582 1726855337.45205: Sending initial data 30582 1726855337.45209: Sent initial data (155 bytes) 30582 1726855337.45666: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.45669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855337.45672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.45675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30582 1726855337.45676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.45739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.45745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855337.45747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.45807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.47346: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855337.47352: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855337.47402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855337.47460: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxnw3tf6w /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py <<< 30582 1726855337.47464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py" <<< 30582 1726855337.47514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxnw3tf6w" to remote "/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py" <<< 30582 1726855337.47520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py" <<< 30582 1726855337.48105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855337.48146: stderr chunk (state=3): >>><<< 30582 1726855337.48150: stdout chunk (state=3): >>><<< 30582 1726855337.48196: done transferring module to remote 30582 1726855337.48205: _low_level_execute_command(): starting 30582 1726855337.48210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/ /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py && sleep 0' 30582 1726855337.48644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.48648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855337.48659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.48717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.48720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855337.48722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.48785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.50542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855337.50569: stderr chunk (state=3): >>><<< 30582 1726855337.50576: stdout chunk (state=3): >>><<< 30582 1726855337.50596: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855337.50600: _low_level_execute_command(): starting 30582 1726855337.50602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/AnsiballZ_command.py && sleep 0' 30582 1726855337.51058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855337.51061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855337.51064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.51066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.51068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.51124: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.51132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855337.51136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.51193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.66513: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:02:17.660984", "end": "2024-09-20 14:02:17.664265", "delta": "0:00:00.003281", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855337.67990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855337.68020: stderr chunk (state=3): >>><<< 30582 1726855337.68023: stdout chunk (state=3): >>><<< 30582 1726855337.68039: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:02:17.660984", "end": "2024-09-20 14:02:17.664265", "delta": "0:00:00.003281", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855337.68071: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855337.68080: _low_level_execute_command(): starting 30582 1726855337.68084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855337.4231572-34015-94543618091042/ > /dev/null 2>&1 && sleep 0' 30582 1726855337.68539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.68542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.68545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855337.68547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855337.68555: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855337.68606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855337.68609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855337.68612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855337.68677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855337.70499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855337.70521: stderr chunk (state=3): >>><<< 30582 1726855337.70524: stdout chunk (state=3): >>><<< 30582 1726855337.70541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855337.70546: handler run complete 30582 1726855337.70565: Evaluated conditional (False): False 30582 1726855337.70575: attempt loop complete, returning result 30582 1726855337.70577: _execute() done 30582 1726855337.70580: dumping result to json 30582 1726855337.70585: done dumping result, returning 30582 1726855337.70594: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-0000000017a8] 30582 1726855337.70599: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017a8 30582 1726855337.70700: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017a8 30582 1726855337.70702: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003281", "end": "2024-09-20 14:02:17.664265", "rc": 0, "start": "2024-09-20 14:02:17.660984" } STDOUT: bonding_masters eth0 lo rpltstbr 30582 1726855337.70774: no more pending results, returning what we have 30582 1726855337.70777: results queue empty 30582 1726855337.70778: checking for any_errors_fatal 30582 1726855337.70780: done checking for any_errors_fatal 30582 1726855337.70780: checking for max_fail_percentage 30582 1726855337.70782: done checking for max_fail_percentage 30582 1726855337.70783: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.70784: done checking to see if all hosts have failed 30582 1726855337.70785: getting the remaining hosts for this loop 30582 1726855337.70789: done getting the remaining hosts for this loop 30582 1726855337.70793: getting the next task for host managed_node3 30582 1726855337.70801: done getting next task for host managed_node3 30582 1726855337.70804: ^ task is: TASK: Set current_interfaces 30582 1726855337.70809: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.70816: getting variables 30582 1726855337.70817: in VariableManager get_vars() 30582 1726855337.70855: Calling all_inventory to load vars for managed_node3 30582 1726855337.70858: Calling groups_inventory to load vars for managed_node3 30582 1726855337.70862: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.70874: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.70877: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.70880: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.71764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.73289: done with get_vars() 30582 1726855337.73323: done getting variables 30582 1726855337.73391: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 14:02:17 -0400 (0:00:00.353) 0:01:14.084 ****** 30582 1726855337.73428: entering _queue_task() for managed_node3/set_fact 30582 1726855337.73813: worker is 1 (out of 1 available) 30582 1726855337.73825: exiting _queue_task() for managed_node3/set_fact 30582 1726855337.73838: done queuing things up, now waiting for results queue to drain 30582 1726855337.73842: waiting for pending results... 
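The "Gather current interface info" task above returns a plain-JSON module result whose `stdout` field holds one interface name per line; the follow-on `set_fact` task derives `current_interfaces` from it. A minimal sketch of that post-processing outside Ansible (the variable names here are illustrative, not Ansible internals):

```python
import json

# Module result as emitted by ansible.legacy.command in the trace above,
# trimmed to the fields this sketch actually uses.
result_json = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr", '
    '"rc": 0, "cmd": ["ls", "-1"]}'
)

result = json.loads(result_json)
# Split stdout into one entry per interface, mirroring stdout_lines.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```

This matches the fact the trace later sets: `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']`.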
30582 1726855337.74308: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855337.74313: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017a9 30582 1726855337.74316: variable 'ansible_search_path' from source: unknown 30582 1726855337.74319: variable 'ansible_search_path' from source: unknown 30582 1726855337.74339: calling self._execute() 30582 1726855337.74442: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.74452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.74469: variable 'omit' from source: magic vars 30582 1726855337.74871: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.74891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.74946: variable 'omit' from source: magic vars 30582 1726855337.74961: variable 'omit' from source: magic vars 30582 1726855337.75086: variable '_current_interfaces' from source: set_fact 30582 1726855337.75165: variable 'omit' from source: magic vars 30582 1726855337.75216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855337.75258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.75291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855337.75385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.75390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.75392: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.75394: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.75396: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.75491: Set connection var ansible_timeout to 10 30582 1726855337.75502: Set connection var ansible_connection to ssh 30582 1726855337.75514: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.75607: Set connection var ansible_pipelining to False 30582 1726855337.75610: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.75612: Set connection var ansible_shell_type to sh 30582 1726855337.75614: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.75617: variable 'ansible_connection' from source: unknown 30582 1726855337.75619: variable 'ansible_module_compression' from source: unknown 30582 1726855337.75620: variable 'ansible_shell_type' from source: unknown 30582 1726855337.75622: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.75624: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.75626: variable 'ansible_pipelining' from source: unknown 30582 1726855337.75627: variable 'ansible_timeout' from source: unknown 30582 1726855337.75629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.75753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.75774: variable 'omit' from source: magic vars 30582 1726855337.75786: starting attempt loop 30582 1726855337.75796: running the handler 30582 1726855337.75811: handler run complete 30582 1726855337.75830: attempt loop complete, returning result 30582 1726855337.75837: _execute() done 30582 1726855337.75843: dumping result to json 30582 1726855337.75851: done dumping result, returning 30582 
1726855337.75864: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-0000000017a9] 30582 1726855337.75879: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017a9 30582 1726855337.76138: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017a9 30582 1726855337.76141: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30582 1726855337.76212: no more pending results, returning what we have 30582 1726855337.76216: results queue empty 30582 1726855337.76217: checking for any_errors_fatal 30582 1726855337.76228: done checking for any_errors_fatal 30582 1726855337.76229: checking for max_fail_percentage 30582 1726855337.76231: done checking for max_fail_percentage 30582 1726855337.76233: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.76233: done checking to see if all hosts have failed 30582 1726855337.76234: getting the remaining hosts for this loop 30582 1726855337.76236: done getting the remaining hosts for this loop 30582 1726855337.76241: getting the next task for host managed_node3 30582 1726855337.76252: done getting next task for host managed_node3 30582 1726855337.76256: ^ task is: TASK: Show current_interfaces 30582 1726855337.76261: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855337.76269: getting variables 30582 1726855337.76271: in VariableManager get_vars() 30582 1726855337.76320: Calling all_inventory to load vars for managed_node3 30582 1726855337.76324: Calling groups_inventory to load vars for managed_node3 30582 1726855337.76328: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.76342: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.76346: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.76349: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.77961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.79691: done with get_vars() 30582 1726855337.79722: done getting variables 30582 1726855337.79788: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:02:17 -0400 (0:00:00.063) 0:01:14.148 ****** 30582 1726855337.79821: entering _queue_task() for managed_node3/debug 30582 1726855337.80198: worker is 1 (out of 1 available) 30582 1726855337.80211: exiting _queue_task() for managed_node3/debug 30582 1726855337.80223: done queuing things up, now waiting for results queue to drain 30582 1726855337.80224: waiting for 
pending results... 30582 1726855337.80607: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855337.80649: in run() - task 0affcc66-ac2b-aa83-7d57-00000000176e 30582 1726855337.80672: variable 'ansible_search_path' from source: unknown 30582 1726855337.80679: variable 'ansible_search_path' from source: unknown 30582 1726855337.80725: calling self._execute() 30582 1726855337.80828: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.80839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.80853: variable 'omit' from source: magic vars 30582 1726855337.81229: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.81249: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.81260: variable 'omit' from source: magic vars 30582 1726855337.81315: variable 'omit' from source: magic vars 30582 1726855337.81422: variable 'current_interfaces' from source: set_fact 30582 1726855337.81458: variable 'omit' from source: magic vars 30582 1726855337.81512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855337.81554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855337.81589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855337.81612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.81631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855337.81669: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855337.81793: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.81795: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.81797: Set connection var ansible_timeout to 10 30582 1726855337.81799: Set connection var ansible_connection to ssh 30582 1726855337.81800: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855337.81809: Set connection var ansible_pipelining to False 30582 1726855337.81816: Set connection var ansible_shell_executable to /bin/sh 30582 1726855337.81821: Set connection var ansible_shell_type to sh 30582 1726855337.81844: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.81852: variable 'ansible_connection' from source: unknown 30582 1726855337.81859: variable 'ansible_module_compression' from source: unknown 30582 1726855337.81865: variable 'ansible_shell_type' from source: unknown 30582 1726855337.81873: variable 'ansible_shell_executable' from source: unknown 30582 1726855337.81878: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.81884: variable 'ansible_pipelining' from source: unknown 30582 1726855337.81891: variable 'ansible_timeout' from source: unknown 30582 1726855337.81901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.82059: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855337.82080: variable 'omit' from source: magic vars 30582 1726855337.82094: starting attempt loop 30582 1726855337.82101: running the handler 30582 1726855337.82160: handler run complete 30582 1726855337.82184: attempt loop complete, returning result 30582 1726855337.82198: _execute() done 30582 1726855337.82205: dumping result to json 30582 1726855337.82213: done dumping result, returning 30582 
1726855337.82229: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-00000000176e] 30582 1726855337.82240: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000176e ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855337.82405: no more pending results, returning what we have 30582 1726855337.82409: results queue empty 30582 1726855337.82410: checking for any_errors_fatal 30582 1726855337.82418: done checking for any_errors_fatal 30582 1726855337.82419: checking for max_fail_percentage 30582 1726855337.82421: done checking for max_fail_percentage 30582 1726855337.82423: checking to see if all hosts have failed and the running result is not ok 30582 1726855337.82424: done checking to see if all hosts have failed 30582 1726855337.82424: getting the remaining hosts for this loop 30582 1726855337.82426: done getting the remaining hosts for this loop 30582 1726855337.82430: getting the next task for host managed_node3 30582 1726855337.82440: done getting next task for host managed_node3 30582 1726855337.82444: ^ task is: TASK: Setup 30582 1726855337.82447: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855337.82454: getting variables 30582 1726855337.82456: in VariableManager get_vars() 30582 1726855337.82504: Calling all_inventory to load vars for managed_node3 30582 1726855337.82507: Calling groups_inventory to load vars for managed_node3 30582 1726855337.82511: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.82523: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.82526: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.82529: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.83603: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000176e 30582 1726855337.83607: WORKER PROCESS EXITING 30582 1726855337.84535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.87013: done with get_vars() 30582 1726855337.87051: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:02:17 -0400 (0:00:00.073) 0:01:14.221 ****** 30582 1726855337.87150: entering _queue_task() for managed_node3/include_tasks 30582 1726855337.87720: worker is 1 (out of 1 available) 30582 1726855337.87732: exiting _queue_task() for managed_node3/include_tasks 30582 1726855337.87744: done queuing things up, now waiting for results queue to drain 30582 1726855337.87746: waiting for pending results... 
30582 1726855337.87872: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855337.88000: in run() - task 0affcc66-ac2b-aa83-7d57-000000001747 30582 1726855337.88020: variable 'ansible_search_path' from source: unknown 30582 1726855337.88026: variable 'ansible_search_path' from source: unknown 30582 1726855337.88079: variable 'lsr_setup' from source: include params 30582 1726855337.88409: variable 'lsr_setup' from source: include params 30582 1726855337.88620: variable 'omit' from source: magic vars 30582 1726855337.89195: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.89316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.89332: variable 'omit' from source: magic vars 30582 1726855337.90022: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.90357: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.90360: variable 'item' from source: unknown 30582 1726855337.90363: variable 'item' from source: unknown 30582 1726855337.90492: variable 'item' from source: unknown 30582 1726855337.90552: variable 'item' from source: unknown 30582 1726855337.90946: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855337.90949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.90952: variable 'omit' from source: magic vars 30582 1726855337.91332: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.91602: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.91605: variable 'item' from source: unknown 30582 1726855337.91608: variable 'item' from source: unknown 30582 1726855337.91620: variable 'item' from source: unknown 30582 1726855337.91686: variable 'item' from source: unknown 30582 1726855337.91985: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855337.91990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855337.91993: variable 'omit' from source: magic vars 30582 1726855337.92251: variable 'ansible_distribution_major_version' from source: facts 30582 1726855337.92379: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855337.92391: variable 'item' from source: unknown 30582 1726855337.92534: variable 'item' from source: unknown 30582 1726855337.92573: variable 'item' from source: unknown 30582 1726855337.92802: variable 'item' from source: unknown 30582 1726855337.92858: dumping result to json 30582 1726855337.92861: done dumping result, returning 30582 1726855337.92864: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-000000001747] 30582 1726855337.92870: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001747 30582 1726855337.93050: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001747 30582 1726855337.93053: WORKER PROCESS EXITING 30582 1726855337.93148: no more pending results, returning what we have 30582 1726855337.93154: in VariableManager get_vars() 30582 1726855337.93203: Calling all_inventory to load vars for managed_node3 30582 1726855337.93206: Calling groups_inventory to load vars for managed_node3 30582 1726855337.93210: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855337.93225: Calling all_plugins_play to load vars for managed_node3 30582 1726855337.93229: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855337.93232: Calling groups_plugins_play to load vars for managed_node3 30582 1726855337.96910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855337.99839: done with get_vars() 30582 1726855337.99873: variable 'ansible_search_path' from source: unknown 30582 1726855337.99875: variable 'ansible_search_path' from source: unknown 30582 
1726855338.00030: variable 'ansible_search_path' from source: unknown 30582 1726855338.00032: variable 'ansible_search_path' from source: unknown 30582 1726855338.00064: variable 'ansible_search_path' from source: unknown 30582 1726855338.00065: variable 'ansible_search_path' from source: unknown 30582 1726855338.00099: we have included files to process 30582 1726855338.00100: generating all_blocks data 30582 1726855338.00102: done generating all_blocks data 30582 1726855338.00220: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855338.00222: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855338.00225: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855338.00778: done processing included file 30582 1726855338.00781: iterating over new_blocks loaded from include file 30582 1726855338.00782: in VariableManager get_vars() 30582 1726855338.01006: done with get_vars() 30582 1726855338.01008: filtering new block on tags 30582 1726855338.01044: done filtering new block on tags 30582 1726855338.01046: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30582 1726855338.01051: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855338.01052: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855338.01056: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855338.01456: done processing included file 30582 1726855338.01458: iterating over new_blocks loaded from include file 30582 1726855338.01460: in VariableManager get_vars() 30582 1726855338.01478: done with get_vars() 30582 1726855338.01480: filtering new block on tags 30582 1726855338.01608: done filtering new block on tags 30582 1726855338.01611: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30582 1726855338.01615: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855338.01616: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855338.01619: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30582 1726855338.01710: done processing included file 30582 1726855338.01712: iterating over new_blocks loaded from include file 30582 1726855338.01713: in VariableManager get_vars() 30582 1726855338.01727: done with get_vars() 30582 1726855338.01729: filtering new block on tags 30582 1726855338.01750: done filtering new block on tags 30582 1726855338.01752: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node3 => (item=tasks/remove_profile.yml) 30582 1726855338.01755: extending task lists for all hosts with included blocks 30582 1726855338.04018: done extending task lists 30582 1726855338.04026: done processing included files 30582 
1726855338.04027: results queue empty 30582 1726855338.04028: checking for any_errors_fatal 30582 1726855338.04032: done checking for any_errors_fatal 30582 1726855338.04033: checking for max_fail_percentage 30582 1726855338.04034: done checking for max_fail_percentage 30582 1726855338.04035: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.04036: done checking to see if all hosts have failed 30582 1726855338.04036: getting the remaining hosts for this loop 30582 1726855338.04038: done getting the remaining hosts for this loop 30582 1726855338.04040: getting the next task for host managed_node3 30582 1726855338.04045: done getting next task for host managed_node3 30582 1726855338.04047: ^ task is: TASK: Include network role 30582 1726855338.04050: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855338.04053: getting variables 30582 1726855338.04054: in VariableManager get_vars() 30582 1726855338.04066: Calling all_inventory to load vars for managed_node3 30582 1726855338.04071: Calling groups_inventory to load vars for managed_node3 30582 1726855338.04073: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.04078: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.04080: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.04082: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.07217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.10118: done with get_vars() 30582 1726855338.10157: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 14:02:18 -0400 (0:00:00.230) 0:01:14.452 ****** 30582 1726855338.10251: entering _queue_task() for managed_node3/include_role 30582 1726855338.10656: worker is 1 (out of 1 available) 30582 1726855338.10670: exiting _queue_task() for managed_node3/include_role 30582 1726855338.10683: done queuing things up, now waiting for results queue to drain 30582 1726855338.10801: waiting for pending results... 
30582 1726855338.11034: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855338.11260: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017d0 30582 1726855338.11265: variable 'ansible_search_path' from source: unknown 30582 1726855338.11269: variable 'ansible_search_path' from source: unknown 30582 1726855338.11273: calling self._execute() 30582 1726855338.11329: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.11340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.11355: variable 'omit' from source: magic vars 30582 1726855338.11797: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.11818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.11835: _execute() done 30582 1726855338.11844: dumping result to json 30582 1726855338.11852: done dumping result, returning 30582 1726855338.11892: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-0000000017d0] 30582 1726855338.11896: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d0 30582 1726855338.12213: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d0 30582 1726855338.12217: WORKER PROCESS EXITING 30582 1726855338.12248: no more pending results, returning what we have 30582 1726855338.12253: in VariableManager get_vars() 30582 1726855338.12306: Calling all_inventory to load vars for managed_node3 30582 1726855338.12310: Calling groups_inventory to load vars for managed_node3 30582 1726855338.12313: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.12407: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.12417: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.12422: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.15264: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.18743: done with get_vars() 30582 1726855338.18776: variable 'ansible_search_path' from source: unknown 30582 1726855338.18778: variable 'ansible_search_path' from source: unknown 30582 1726855338.19198: variable 'omit' from source: magic vars 30582 1726855338.19241: variable 'omit' from source: magic vars 30582 1726855338.19257: variable 'omit' from source: magic vars 30582 1726855338.19262: we have included files to process 30582 1726855338.19263: generating all_blocks data 30582 1726855338.19264: done generating all_blocks data 30582 1726855338.19266: processing included file: fedora.linux_system_roles.network 30582 1726855338.19493: in VariableManager get_vars() 30582 1726855338.19510: done with get_vars() 30582 1726855338.19539: in VariableManager get_vars() 30582 1726855338.19558: done with get_vars() 30582 1726855338.19601: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855338.19930: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855338.20014: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855338.21115: in VariableManager get_vars() 30582 1726855338.21140: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855338.26121: iterating over new_blocks loaded from include file 30582 1726855338.26124: in VariableManager get_vars() 30582 1726855338.26147: done with get_vars() 30582 1726855338.26149: filtering new block on tags 30582 1726855338.26917: done filtering new block on tags 30582 1726855338.26922: in VariableManager get_vars() 30582 1726855338.26939: done with get_vars() 30582 1726855338.26940: filtering new block on tags 30582 1726855338.26957: done 
filtering new block on tags 30582 1726855338.26959: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855338.26965: extending task lists for all hosts with included blocks 30582 1726855338.27335: done extending task lists 30582 1726855338.27336: done processing included files 30582 1726855338.27337: results queue empty 30582 1726855338.27338: checking for any_errors_fatal 30582 1726855338.27342: done checking for any_errors_fatal 30582 1726855338.27342: checking for max_fail_percentage 30582 1726855338.27344: done checking for max_fail_percentage 30582 1726855338.27344: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.27345: done checking to see if all hosts have failed 30582 1726855338.27346: getting the remaining hosts for this loop 30582 1726855338.27347: done getting the remaining hosts for this loop 30582 1726855338.27350: getting the next task for host managed_node3 30582 1726855338.27354: done getting next task for host managed_node3 30582 1726855338.27357: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855338.27360: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855338.27375: getting variables 30582 1726855338.27376: in VariableManager get_vars() 30582 1726855338.27392: Calling all_inventory to load vars for managed_node3 30582 1726855338.27394: Calling groups_inventory to load vars for managed_node3 30582 1726855338.27396: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.27402: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.27405: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.27407: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.29866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.33083: done with get_vars() 30582 1726855338.33121: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:18 -0400 (0:00:00.229) 0:01:14.682 ****** 30582 1726855338.33215: entering _queue_task() for managed_node3/include_tasks 30582 1726855338.33831: worker is 1 (out of 1 available) 30582 1726855338.33845: exiting _queue_task() for managed_node3/include_tasks 30582 1726855338.33857: done queuing things up, now waiting for results queue to drain 30582 1726855338.33859: waiting for pending results... 
30582 1726855338.34152: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855338.34180: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183a 30582 1726855338.34205: variable 'ansible_search_path' from source: unknown 30582 1726855338.34213: variable 'ansible_search_path' from source: unknown 30582 1726855338.34265: calling self._execute() 30582 1726855338.34381: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.34398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.34466: variable 'omit' from source: magic vars 30582 1726855338.34831: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.34848: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.34858: _execute() done 30582 1726855338.34865: dumping result to json 30582 1726855338.34875: done dumping result, returning 30582 1726855338.34889: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-00000000183a] 30582 1726855338.34905: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183a 30582 1726855338.35186: no more pending results, returning what we have 30582 1726855338.35193: in VariableManager get_vars() 30582 1726855338.35251: Calling all_inventory to load vars for managed_node3 30582 1726855338.35254: Calling groups_inventory to load vars for managed_node3 30582 1726855338.35257: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.35347: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.35352: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.35355: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.36200: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183a 30582 
1726855338.36204: WORKER PROCESS EXITING 30582 1726855338.38218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.41794: done with get_vars() 30582 1726855338.41817: variable 'ansible_search_path' from source: unknown 30582 1726855338.41819: variable 'ansible_search_path' from source: unknown 30582 1726855338.41862: we have included files to process 30582 1726855338.41863: generating all_blocks data 30582 1726855338.41865: done generating all_blocks data 30582 1726855338.41870: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855338.41871: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855338.41874: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855338.43078: done processing included file 30582 1726855338.43080: iterating over new_blocks loaded from include file 30582 1726855338.43082: in VariableManager get_vars() 30582 1726855338.43275: done with get_vars() 30582 1726855338.43277: filtering new block on tags 30582 1726855338.43317: done filtering new block on tags 30582 1726855338.43320: in VariableManager get_vars() 30582 1726855338.43344: done with get_vars() 30582 1726855338.43345: filtering new block on tags 30582 1726855338.43400: done filtering new block on tags 30582 1726855338.43404: in VariableManager get_vars() 30582 1726855338.43427: done with get_vars() 30582 1726855338.43429: filtering new block on tags 30582 1726855338.43481: done filtering new block on tags 30582 1726855338.43484: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855338.43491: extending task lists for all hosts 
with included blocks 30582 1726855338.52692: done extending task lists 30582 1726855338.52694: done processing included files 30582 1726855338.52695: results queue empty 30582 1726855338.52696: checking for any_errors_fatal 30582 1726855338.52699: done checking for any_errors_fatal 30582 1726855338.52700: checking for max_fail_percentage 30582 1726855338.52702: done checking for max_fail_percentage 30582 1726855338.52703: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.52703: done checking to see if all hosts have failed 30582 1726855338.52704: getting the remaining hosts for this loop 30582 1726855338.52706: done getting the remaining hosts for this loop 30582 1726855338.52708: getting the next task for host managed_node3 30582 1726855338.52713: done getting next task for host managed_node3 30582 1726855338.52716: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855338.52719: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855338.52736: getting variables 30582 1726855338.52737: in VariableManager get_vars() 30582 1726855338.52756: Calling all_inventory to load vars for managed_node3 30582 1726855338.52758: Calling groups_inventory to load vars for managed_node3 30582 1726855338.52761: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.52766: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.52772: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.52775: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.53996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.55658: done with get_vars() 30582 1726855338.55683: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:18 -0400 (0:00:00.225) 0:01:14.907 ****** 30582 1726855338.55755: entering _queue_task() for managed_node3/setup 30582 1726855338.56050: worker is 1 (out of 1 available) 30582 1726855338.56066: exiting _queue_task() for managed_node3/setup 30582 1726855338.56080: done queuing things up, now waiting for results queue to drain 30582 1726855338.56082: waiting for pending results... 
30582 1726855338.56331: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855338.56524: in run() - task 0affcc66-ac2b-aa83-7d57-000000001897 30582 1726855338.56545: variable 'ansible_search_path' from source: unknown 30582 1726855338.56553: variable 'ansible_search_path' from source: unknown 30582 1726855338.56596: calling self._execute() 30582 1726855338.56704: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.56716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.56735: variable 'omit' from source: magic vars 30582 1726855338.57135: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.57154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.57416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855338.59592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855338.59678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855338.59726: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855338.59767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855338.59801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855338.59883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855338.59927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855338.59958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855338.60093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855338.60097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855338.60099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855338.60120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855338.60151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855338.60200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855338.60223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855338.60399: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855338.60414: variable 'ansible_facts' from source: unknown 30582 1726855338.61209: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855338.61218: when evaluation is False, skipping this task 30582 1726855338.61226: _execute() done 30582 1726855338.61233: dumping result to json 30582 1726855338.61240: done dumping result, returning 30582 1726855338.61252: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000001897] 30582 1726855338.61262: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001897 30582 1726855338.61358: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001897 30582 1726855338.61361: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855338.61556: no more pending results, returning what we have 30582 1726855338.61560: results queue empty 30582 1726855338.61561: checking for any_errors_fatal 30582 1726855338.61563: done checking for any_errors_fatal 30582 1726855338.61564: checking for max_fail_percentage 30582 1726855338.61566: done checking for max_fail_percentage 30582 1726855338.61569: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.61570: done checking to see if all hosts have failed 30582 1726855338.61571: getting the remaining hosts for this loop 30582 1726855338.61572: done getting the remaining hosts for this loop 30582 1726855338.61576: getting the next task for host managed_node3 30582 1726855338.61589: done getting next task for host managed_node3 30582 1726855338.61594: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855338.61601: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855338.61622: getting variables 30582 1726855338.61624: in VariableManager get_vars() 30582 1726855338.61785: Calling all_inventory to load vars for managed_node3 30582 1726855338.61792: Calling groups_inventory to load vars for managed_node3 30582 1726855338.61795: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.61806: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.61810: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.61818: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.63177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.64582: done with get_vars() 30582 1726855338.64614: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:18 -0400 (0:00:00.089) 0:01:14.997 ****** 30582 1726855338.64719: entering _queue_task() for managed_node3/stat 30582 1726855338.64996: worker is 1 (out of 1 available) 30582 1726855338.65012: exiting _queue_task() for managed_node3/stat 30582 1726855338.65024: done queuing things up, now waiting for results queue to drain 30582 1726855338.65026: waiting for pending results... 
30582 1726855338.65222: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855338.65345: in run() - task 0affcc66-ac2b-aa83-7d57-000000001899 30582 1726855338.65359: variable 'ansible_search_path' from source: unknown 30582 1726855338.65363: variable 'ansible_search_path' from source: unknown 30582 1726855338.65393: calling self._execute() 30582 1726855338.65464: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.65469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.65479: variable 'omit' from source: magic vars 30582 1726855338.65757: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.65768: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.65888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855338.66085: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855338.66120: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855338.66176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855338.66208: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855338.66273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855338.66293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855338.66313: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855338.66333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855338.66408: variable '__network_is_ostree' from source: set_fact 30582 1726855338.66411: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855338.66414: when evaluation is False, skipping this task 30582 1726855338.66417: _execute() done 30582 1726855338.66423: dumping result to json 30582 1726855338.66428: done dumping result, returning 30582 1726855338.66434: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000001899] 30582 1726855338.66439: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001899 30582 1726855338.66530: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001899 30582 1726855338.66533: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855338.66617: no more pending results, returning what we have 30582 1726855338.66620: results queue empty 30582 1726855338.66621: checking for any_errors_fatal 30582 1726855338.66629: done checking for any_errors_fatal 30582 1726855338.66630: checking for max_fail_percentage 30582 1726855338.66631: done checking for max_fail_percentage 30582 1726855338.66632: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.66633: done checking to see if all hosts have failed 30582 1726855338.66634: getting the remaining hosts for this loop 30582 1726855338.66635: done getting the remaining hosts for this loop 30582 
1726855338.66639: getting the next task for host managed_node3 30582 1726855338.66650: done getting next task for host managed_node3 30582 1726855338.66653: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855338.66660: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855338.66681: getting variables 30582 1726855338.66682: in VariableManager get_vars() 30582 1726855338.66728: Calling all_inventory to load vars for managed_node3 30582 1726855338.66731: Calling groups_inventory to load vars for managed_node3 30582 1726855338.66733: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.66742: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.66745: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.66748: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.68003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.68918: done with get_vars() 30582 1726855338.68940: done getting variables 30582 1726855338.68990: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:18 -0400 (0:00:00.043) 0:01:15.040 ****** 30582 1726855338.69020: entering _queue_task() for managed_node3/set_fact 30582 1726855338.69291: worker is 1 (out of 1 available) 30582 1726855338.69306: exiting _queue_task() for managed_node3/set_fact 30582 1726855338.69318: done queuing things up, now waiting for results queue to drain 30582 1726855338.69320: waiting for pending results... 
30582 1726855338.69509: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855338.69617: in run() - task 0affcc66-ac2b-aa83-7d57-00000000189a 30582 1726855338.69630: variable 'ansible_search_path' from source: unknown 30582 1726855338.69634: variable 'ansible_search_path' from source: unknown 30582 1726855338.69665: calling self._execute() 30582 1726855338.69742: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.69747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.69755: variable 'omit' from source: magic vars 30582 1726855338.70039: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.70049: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.70169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855338.70369: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855338.70407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855338.70468: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855338.70498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855338.70565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855338.70586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855338.70605: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855338.70624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855338.70696: variable '__network_is_ostree' from source: set_fact 30582 1726855338.70702: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855338.70705: when evaluation is False, skipping this task 30582 1726855338.70708: _execute() done 30582 1726855338.70710: dumping result to json 30582 1726855338.70714: done dumping result, returning 30582 1726855338.70722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-00000000189a] 30582 1726855338.70727: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189a 30582 1726855338.70814: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189a 30582 1726855338.70816: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855338.70891: no more pending results, returning what we have 30582 1726855338.70895: results queue empty 30582 1726855338.70896: checking for any_errors_fatal 30582 1726855338.70903: done checking for any_errors_fatal 30582 1726855338.70904: checking for max_fail_percentage 30582 1726855338.70906: done checking for max_fail_percentage 30582 1726855338.70907: checking to see if all hosts have failed and the running result is not ok 30582 1726855338.70908: done checking to see if all hosts have failed 30582 1726855338.70908: getting the remaining hosts for this loop 30582 1726855338.70910: done getting the remaining hosts for this loop 
30582 1726855338.70914: getting the next task for host managed_node3 30582 1726855338.70924: done getting next task for host managed_node3 30582 1726855338.70928: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855338.70934: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855338.70954: getting variables 30582 1726855338.70956: in VariableManager get_vars() 30582 1726855338.70995: Calling all_inventory to load vars for managed_node3 30582 1726855338.70998: Calling groups_inventory to load vars for managed_node3 30582 1726855338.71000: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855338.71010: Calling all_plugins_play to load vars for managed_node3 30582 1726855338.71013: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855338.71015: Calling groups_plugins_play to load vars for managed_node3 30582 1726855338.71948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855338.72820: done with get_vars() 30582 1726855338.72838: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:18 -0400 (0:00:00.038) 0:01:15.079 ****** 30582 1726855338.72909: entering _queue_task() for managed_node3/service_facts 30582 1726855338.73158: worker is 1 (out of 1 available) 30582 1726855338.73172: exiting _queue_task() for managed_node3/service_facts 30582 1726855338.73184: done queuing things up, now waiting for results queue to drain 30582 1726855338.73186: waiting for pending results... 
30582 1726855338.73365: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855338.73477: in run() - task 0affcc66-ac2b-aa83-7d57-00000000189c 30582 1726855338.73491: variable 'ansible_search_path' from source: unknown 30582 1726855338.73495: variable 'ansible_search_path' from source: unknown 30582 1726855338.73526: calling self._execute() 30582 1726855338.73595: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.73600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.73607: variable 'omit' from source: magic vars 30582 1726855338.73882: variable 'ansible_distribution_major_version' from source: facts 30582 1726855338.73894: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855338.73899: variable 'omit' from source: magic vars 30582 1726855338.73953: variable 'omit' from source: magic vars 30582 1726855338.73982: variable 'omit' from source: magic vars 30582 1726855338.74014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855338.74041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855338.74057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855338.74075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855338.74085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855338.74111: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855338.74115: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.74117: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855338.74192: Set connection var ansible_timeout to 10 30582 1726855338.74196: Set connection var ansible_connection to ssh 30582 1726855338.74201: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855338.74206: Set connection var ansible_pipelining to False 30582 1726855338.74212: Set connection var ansible_shell_executable to /bin/sh 30582 1726855338.74214: Set connection var ansible_shell_type to sh 30582 1726855338.74231: variable 'ansible_shell_executable' from source: unknown 30582 1726855338.74233: variable 'ansible_connection' from source: unknown 30582 1726855338.74236: variable 'ansible_module_compression' from source: unknown 30582 1726855338.74238: variable 'ansible_shell_type' from source: unknown 30582 1726855338.74240: variable 'ansible_shell_executable' from source: unknown 30582 1726855338.74242: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855338.74244: variable 'ansible_pipelining' from source: unknown 30582 1726855338.74248: variable 'ansible_timeout' from source: unknown 30582 1726855338.74252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855338.74402: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855338.74407: variable 'omit' from source: magic vars 30582 1726855338.74412: starting attempt loop 30582 1726855338.74414: running the handler 30582 1726855338.74426: _low_level_execute_command(): starting 30582 1726855338.74433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855338.74955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855338.74959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.74963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855338.74965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.75012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855338.75015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855338.75017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855338.75092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855338.76783: stdout chunk (state=3): >>>/root <<< 30582 1726855338.76874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855338.76910: stderr chunk (state=3): >>><<< 30582 1726855338.76914: stdout chunk (state=3): >>><<< 30582 1726855338.76934: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855338.76947: _low_level_execute_command(): starting 30582 1726855338.76953: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307 `" && echo ansible-tmp-1726855338.7693465-34067-59548525309307="` echo /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307 `" ) && sleep 0' 30582 1726855338.77438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855338.77442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855338.77445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.77456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855338.77458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.77493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855338.77511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855338.77514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855338.77570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855338.79445: stdout chunk (state=3): >>>ansible-tmp-1726855338.7693465-34067-59548525309307=/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307 <<< 30582 1726855338.79553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855338.79577: stderr chunk (state=3): >>><<< 30582 1726855338.79580: stdout chunk (state=3): >>><<< 30582 1726855338.79597: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855338.7693465-34067-59548525309307=/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855338.79641: variable 'ansible_module_compression' from source: unknown 30582 1726855338.79678: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855338.79711: variable 'ansible_facts' from source: unknown 30582 1726855338.79773: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py 30582 1726855338.79873: Sending initial data 30582 1726855338.79877: Sent initial data (161 bytes) 30582 1726855338.80323: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855338.80327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855338.80329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855338.80331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.80333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855338.80335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855338.80337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855338.80384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855338.80390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855338.80392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855338.80457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855338.82054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855338.82102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855338.82175: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_gyzdh3c /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py <<< 30582 1726855338.82179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py" <<< 30582 1726855338.82223: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_gyzdh3c" to remote "/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py" <<< 30582 1726855338.83053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855338.83092: stderr chunk (state=3): >>><<< 30582 1726855338.83096: stdout chunk (state=3): >>><<< 30582 1726855338.83285: done transferring module to remote 30582 1726855338.83296: _low_level_execute_command(): starting 30582 1726855338.83299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/ /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py && sleep 0' 30582 1726855338.83912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855338.83936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855338.84017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855338.85937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855338.85941: stdout chunk (state=3): >>><<< 30582 1726855338.85943: stderr chunk (state=3): >>><<< 30582 1726855338.85946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855338.85948: _low_level_execute_command(): starting 30582 1726855338.85950: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/AnsiballZ_service_facts.py && sleep 0' 30582 1726855338.86720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855338.86793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855338.86796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855338.86799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855338.86801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855338.86803: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855338.86858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855338.86875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 
1726855338.86974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.38362: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855340.38428: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 30582 1726855340.40113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855340.40168: stderr chunk (state=3): >>><<< 30582 1726855340.40171: stdout chunk (state=3): >>><<< 30582 1726855340.40265: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855340.42486: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855340.42493: _low_level_execute_command(): starting 30582 1726855340.42496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855338.7693465-34067-59548525309307/ > /dev/null 2>&1 && sleep 0' 30582 1726855340.43601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855340.43643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855340.43662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855340.43681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855340.43699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855340.43859: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855340.43970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.44058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.46067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855340.46127: stderr chunk (state=3): >>><<< 30582 1726855340.46247: stdout chunk (state=3): >>><<< 30582 1726855340.46270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30582 1726855340.46280: handler run complete 30582 1726855340.46804: variable 'ansible_facts' from source: unknown 30582 1726855340.47029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855340.47760: variable 'ansible_facts' from source: unknown 30582 1726855340.47927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855340.48182: attempt loop complete, returning result 30582 1726855340.48197: _execute() done 30582 1726855340.48216: dumping result to json 30582 1726855340.48293: done dumping result, returning 30582 1726855340.48309: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-00000000189c] 30582 1726855340.48320: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189c 30582 1726855340.49689: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189c 30582 1726855340.49693: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855340.49868: no more pending results, returning what we have 30582 1726855340.49871: results queue empty 30582 1726855340.49872: checking for any_errors_fatal 30582 1726855340.49877: done checking for any_errors_fatal 30582 1726855340.49878: checking for max_fail_percentage 30582 1726855340.49880: done checking for max_fail_percentage 30582 1726855340.49880: checking to see if all hosts have failed and the running result is not ok 30582 1726855340.49881: done checking to see if all hosts have failed 30582 1726855340.49882: getting the remaining hosts for this loop 30582 1726855340.49883: done getting the remaining hosts for this loop 30582 1726855340.49889: getting the next task for host managed_node3 30582 1726855340.49896: done 
getting next task for host managed_node3 30582 1726855340.49900: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855340.49907: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855340.50001: getting variables 30582 1726855340.50003: in VariableManager get_vars() 30582 1726855340.50046: Calling all_inventory to load vars for managed_node3 30582 1726855340.50049: Calling groups_inventory to load vars for managed_node3 30582 1726855340.50051: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855340.50061: Calling all_plugins_play to load vars for managed_node3 30582 1726855340.50067: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855340.50070: Calling groups_plugins_play to load vars for managed_node3 30582 1726855340.51468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855340.53098: done with get_vars() 30582 1726855340.53127: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:20 -0400 (0:00:01.803) 0:01:16.882 ****** 30582 1726855340.53235: entering _queue_task() for managed_node3/package_facts 30582 1726855340.53622: worker is 1 (out of 1 available) 30582 1726855340.53800: exiting _queue_task() for managed_node3/package_facts 30582 1726855340.53812: done queuing things up, now waiting for results queue to drain 30582 1726855340.53813: waiting for pending results... 
30582 1726855340.53956: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855340.54150: in run() - task 0affcc66-ac2b-aa83-7d57-00000000189d 30582 1726855340.54156: variable 'ansible_search_path' from source: unknown 30582 1726855340.54158: variable 'ansible_search_path' from source: unknown 30582 1726855340.54261: calling self._execute() 30582 1726855340.54314: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855340.54324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855340.54339: variable 'omit' from source: magic vars 30582 1726855340.54739: variable 'ansible_distribution_major_version' from source: facts 30582 1726855340.54756: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855340.54768: variable 'omit' from source: magic vars 30582 1726855340.54855: variable 'omit' from source: magic vars 30582 1726855340.54894: variable 'omit' from source: magic vars 30582 1726855340.54947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855340.55022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855340.55025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855340.55030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855340.55044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855340.55077: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855340.55084: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855340.55092: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855340.55494: Set connection var ansible_timeout to 10 30582 1726855340.55497: Set connection var ansible_connection to ssh 30582 1726855340.55500: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855340.55502: Set connection var ansible_pipelining to False 30582 1726855340.55504: Set connection var ansible_shell_executable to /bin/sh 30582 1726855340.55505: Set connection var ansible_shell_type to sh 30582 1726855340.55507: variable 'ansible_shell_executable' from source: unknown 30582 1726855340.55508: variable 'ansible_connection' from source: unknown 30582 1726855340.55510: variable 'ansible_module_compression' from source: unknown 30582 1726855340.55512: variable 'ansible_shell_type' from source: unknown 30582 1726855340.55514: variable 'ansible_shell_executable' from source: unknown 30582 1726855340.55515: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855340.55517: variable 'ansible_pipelining' from source: unknown 30582 1726855340.55518: variable 'ansible_timeout' from source: unknown 30582 1726855340.55520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855340.55894: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855340.55899: variable 'omit' from source: magic vars 30582 1726855340.55901: starting attempt loop 30582 1726855340.55903: running the handler 30582 1726855340.55905: _low_level_execute_command(): starting 30582 1726855340.55908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855340.57323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.57541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855340.57545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855340.57592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.57651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.59355: stdout chunk (state=3): >>>/root <<< 30582 1726855340.59604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855340.59641: stderr chunk (state=3): >>><<< 30582 1726855340.59650: stdout chunk (state=3): >>><<< 30582 1726855340.59679: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855340.59706: _low_level_execute_command(): starting 30582 1726855340.59804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802 `" && echo ansible-tmp-1726855340.5969253-34140-109927176759802="` echo /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802 `" ) && sleep 0' 30582 1726855340.61540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855340.61553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.61648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.63617: stdout chunk (state=3): >>>ansible-tmp-1726855340.5969253-34140-109927176759802=/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802 <<< 30582 1726855340.63714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855340.63769: stderr chunk (state=3): >>><<< 30582 1726855340.63775: stdout chunk (state=3): >>><<< 30582 1726855340.63800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855340.5969253-34140-109927176759802=/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855340.63894: variable 'ansible_module_compression' from source: unknown 30582 1726855340.63906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855340.63975: variable 'ansible_facts' from source: unknown 30582 1726855340.64361: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py 30582 1726855340.65031: Sending initial data 30582 1726855340.65035: Sent initial data (162 bytes) 30582 1726855340.66206: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.66358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855340.66362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 
1726855340.66365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.66509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.68097: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855340.68106: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855340.68113: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30582 1726855340.68131: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855340.68222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855340.68284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpa4iclvtt /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py <<< 30582 1726855340.68309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py" <<< 30582 1726855340.68376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpa4iclvtt" to remote "/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py" <<< 30582 1726855340.69980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855340.70021: stderr chunk (state=3): >>><<< 30582 1726855340.70025: stdout chunk (state=3): >>><<< 30582 1726855340.70040: done transferring module to remote 30582 1726855340.70049: _low_level_execute_command(): starting 30582 1726855340.70053: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/ /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py && sleep 0' 30582 1726855340.70493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855340.70497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855340.70500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.70502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855340.70511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.70548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855340.70561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.70641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855340.72580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855340.72615: stderr chunk (state=3): >>><<< 30582 1726855340.72618: stdout chunk (state=3): >>><<< 30582 1726855340.72643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855340.72647: _low_level_execute_command(): starting 30582 1726855340.72649: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/AnsiballZ_package_facts.py && sleep 0' 30582 1726855340.73100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855340.73104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855340.73106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.73108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855340.73110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855340.73114: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855340.73156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855340.73159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855340.73236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855341.17746: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30582 1726855341.18192: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": 
"cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": 
"sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": 
"python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": 
"9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", 
"release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", 
"version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": 
"511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": 
"2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": 
"checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855341.20024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855341.20031: stdout chunk (state=3): >>><<< 30582 1726855341.20034: stderr chunk (state=3): >>><<< 30582 1726855341.20157: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855341.24838: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855341.24895: _low_level_execute_command(): starting 30582 1726855341.24898: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855340.5969253-34140-109927176759802/ > /dev/null 2>&1 && sleep 0' 30582 1726855341.25474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855341.25507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855341.25596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855341.25647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855341.25739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855341.27635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855341.27656: stdout chunk (state=3): >>><<< 30582 1726855341.27659: stderr chunk (state=3): >>><<< 30582 1726855341.27675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855341.27892: handler run complete 30582 1726855341.28517: variable 'ansible_facts' from source: unknown 30582 1726855341.28937: 
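The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'` lines show SSH connection multiplexing at work: every task (including the `rm -f -r` tmpdir cleanup above) reuses one persistent master connection instead of paying a fresh handshake, which is why each exchange ends with `Received exit status from master 0` rather than a full key-exchange trace. Ansible's default `ssh_args` enable this with `ControlMaster=auto` and `ControlPersist=60s`, pointing `ControlPath` at a hashed socket under `~/.ansible/cp/`. A roughly equivalent manual `ssh_config` fragment (a sketch, not Ansible's generated arguments) would be:

```
Host 10.31.9.244
    # Reuse an existing master connection, or become one if none exists.
    ControlMaster auto
    # %C is a hash of local host, remote host, port, and user —
    # analogous to the db1ec2560f socket name in the log.
    ControlPath ~/.ansible/cp/%C
    # Keep the master open for 60s after the last session closes.
    ControlPersist 60s
```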
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.30996: variable 'ansible_facts' from source: unknown 30582 1726855341.31522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.32232: attempt loop complete, returning result 30582 1726855341.32253: _execute() done 30582 1726855341.32262: dumping result to json 30582 1726855341.32486: done dumping result, returning 30582 1726855341.32502: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-00000000189d] 30582 1726855341.32512: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189d 30582 1726855341.34838: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000189d 30582 1726855341.34841: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855341.34997: no more pending results, returning what we have 30582 1726855341.35000: results queue empty 30582 1726855341.35001: checking for any_errors_fatal 30582 1726855341.35006: done checking for any_errors_fatal 30582 1726855341.35007: checking for max_fail_percentage 30582 1726855341.35008: done checking for max_fail_percentage 30582 1726855341.35009: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.35010: done checking to see if all hosts have failed 30582 1726855341.35011: getting the remaining hosts for this loop 30582 1726855341.35012: done getting the remaining hosts for this loop 30582 1726855341.35015: getting the next task for host managed_node3 30582 1726855341.35023: done getting next task for host managed_node3 30582 1726855341.35026: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855341.35032: 
^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855341.35044: getting variables 30582 1726855341.35045: in VariableManager get_vars() 30582 1726855341.35079: Calling all_inventory to load vars for managed_node3 30582 1726855341.35082: Calling groups_inventory to load vars for managed_node3 30582 1726855341.35084: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.35097: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.35100: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.35103: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.36390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.37986: done with get_vars() 30582 1726855341.38012: done getting variables 30582 1726855341.38074: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:21 -0400 (0:00:00.848) 0:01:17.731 ****** 30582 1726855341.38118: entering _queue_task() for managed_node3/debug 30582 1726855341.38481: worker is 1 (out of 1 available) 30582 1726855341.38695: exiting _queue_task() for managed_node3/debug 30582 1726855341.38706: done queuing things up, now waiting for results queue to drain 30582 1726855341.38707: waiting for pending results... 
30582 1726855341.38835: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855341.38960: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183b 30582 1726855341.38983: variable 'ansible_search_path' from source: unknown 30582 1726855341.38993: variable 'ansible_search_path' from source: unknown 30582 1726855341.39039: calling self._execute() 30582 1726855341.39148: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.39152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.39158: variable 'omit' from source: magic vars 30582 1726855341.39584: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.39590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.39592: variable 'omit' from source: magic vars 30582 1726855341.39652: variable 'omit' from source: magic vars 30582 1726855341.39755: variable 'network_provider' from source: set_fact 30582 1726855341.39781: variable 'omit' from source: magic vars 30582 1726855341.39829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855341.39910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855341.39913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855341.39921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855341.39939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855341.39981: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855341.39994: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855341.40005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.40126: Set connection var ansible_timeout to 10 30582 1726855341.40129: Set connection var ansible_connection to ssh 30582 1726855341.40235: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855341.40238: Set connection var ansible_pipelining to False 30582 1726855341.40240: Set connection var ansible_shell_executable to /bin/sh 30582 1726855341.40243: Set connection var ansible_shell_type to sh 30582 1726855341.40244: variable 'ansible_shell_executable' from source: unknown 30582 1726855341.40247: variable 'ansible_connection' from source: unknown 30582 1726855341.40249: variable 'ansible_module_compression' from source: unknown 30582 1726855341.40250: variable 'ansible_shell_type' from source: unknown 30582 1726855341.40252: variable 'ansible_shell_executable' from source: unknown 30582 1726855341.40255: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.40257: variable 'ansible_pipelining' from source: unknown 30582 1726855341.40258: variable 'ansible_timeout' from source: unknown 30582 1726855341.40260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.40376: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855341.40395: variable 'omit' from source: magic vars 30582 1726855341.40406: starting attempt loop 30582 1726855341.40413: running the handler 30582 1726855341.40469: handler run complete 30582 1726855341.40486: attempt loop complete, returning result 30582 1726855341.40494: _execute() done 30582 1726855341.40499: dumping result to json 30582 1726855341.40505: done dumping result, returning 
30582 1726855341.40514: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-00000000183b] 30582 1726855341.40522: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183b ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855341.40674: no more pending results, returning what we have 30582 1726855341.40678: results queue empty 30582 1726855341.40679: checking for any_errors_fatal 30582 1726855341.40688: done checking for any_errors_fatal 30582 1726855341.40689: checking for max_fail_percentage 30582 1726855341.40691: done checking for max_fail_percentage 30582 1726855341.40692: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.40693: done checking to see if all hosts have failed 30582 1726855341.40693: getting the remaining hosts for this loop 30582 1726855341.40695: done getting the remaining hosts for this loop 30582 1726855341.40699: getting the next task for host managed_node3 30582 1726855341.40708: done getting next task for host managed_node3 30582 1726855341.40714: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855341.40720: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.40738: getting variables 30582 1726855341.40740: in VariableManager get_vars() 30582 1726855341.40998: Calling all_inventory to load vars for managed_node3 30582 1726855341.41001: Calling groups_inventory to load vars for managed_node3 30582 1726855341.41003: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.41014: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.41017: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.41019: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.41601: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183b 30582 1726855341.41605: WORKER PROCESS EXITING 30582 1726855341.42623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.44306: done with get_vars() 30582 1726855341.44341: done getting variables 30582 1726855341.44405: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:21 -0400 (0:00:00.063) 0:01:17.794 ****** 30582 1726855341.44451: entering _queue_task() for managed_node3/fail 30582 1726855341.44742: worker is 1 (out of 1 available) 30582 1726855341.44756: exiting _queue_task() for managed_node3/fail 30582 1726855341.44768: done queuing things up, now waiting for results queue to drain 30582 1726855341.44770: waiting for pending results... 30582 1726855341.44960: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855341.45057: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183c 30582 1726855341.45068: variable 'ansible_search_path' from source: unknown 30582 1726855341.45073: variable 'ansible_search_path' from source: unknown 30582 1726855341.45106: calling self._execute() 30582 1726855341.45180: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.45185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.45195: variable 'omit' from source: magic vars 30582 1726855341.45488: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.45498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.45585: variable 'network_state' from source: role '' defaults 30582 1726855341.45595: Evaluated conditional (network_state != {}): False 30582 1726855341.45598: when evaluation is False, skipping this task 30582 1726855341.45601: _execute() done 30582 1726855341.45604: dumping result to json 30582 1726855341.45606: done dumping result, returning 30582 1726855341.45614: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-00000000183c] 30582 1726855341.45619: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183c 30582 1726855341.45736: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183c 30582 1726855341.45739: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855341.45837: no more pending results, returning what we have 30582 1726855341.45842: results queue empty 30582 1726855341.45843: checking for any_errors_fatal 30582 1726855341.45848: done checking for any_errors_fatal 30582 1726855341.45848: checking for max_fail_percentage 30582 1726855341.45850: done checking for max_fail_percentage 30582 1726855341.45851: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.45852: done checking to see if all hosts have failed 30582 1726855341.45852: getting the remaining hosts for this loop 30582 1726855341.45854: done getting the remaining hosts for this loop 30582 1726855341.45859: getting the next task for host managed_node3 30582 1726855341.45866: done getting next task for host managed_node3 30582 1726855341.45869: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855341.45874: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.45900: getting variables 30582 1726855341.45902: in VariableManager get_vars() 30582 1726855341.45946: Calling all_inventory to load vars for managed_node3 30582 1726855341.45948: Calling groups_inventory to load vars for managed_node3 30582 1726855341.45950: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.45961: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.45964: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.45966: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.47481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.49179: done with get_vars() 30582 1726855341.49206: done getting variables 30582 1726855341.49269: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:21 -0400 (0:00:00.048) 0:01:17.842 ****** 30582 1726855341.49304: entering _queue_task() for managed_node3/fail 30582 1726855341.49649: worker is 1 (out of 1 available) 30582 1726855341.49662: exiting _queue_task() for managed_node3/fail 30582 1726855341.49674: done queuing things up, now waiting for results queue to drain 30582 1726855341.49676: waiting for pending results... 30582 1726855341.50017: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855341.50194: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183d 30582 1726855341.50198: variable 'ansible_search_path' from source: unknown 30582 1726855341.50200: variable 'ansible_search_path' from source: unknown 30582 1726855341.50203: calling self._execute() 30582 1726855341.50299: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.50310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.50325: variable 'omit' from source: magic vars 30582 1726855341.50832: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.50836: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.50949: variable 'network_state' from source: role '' defaults 30582 1726855341.50964: Evaluated conditional (network_state != {}): False 30582 1726855341.50972: when evaluation is False, skipping this task 30582 1726855341.51054: _execute() done 30582 1726855341.51058: dumping result to json 30582 1726855341.51061: done dumping result, returning 30582 1726855341.51065: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-00000000183d] 30582 1726855341.51067: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183d 30582 1726855341.51139: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183d 30582 1726855341.51142: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855341.51198: no more pending results, returning what we have 30582 1726855341.51203: results queue empty 30582 1726855341.51204: checking for any_errors_fatal 30582 1726855341.51213: done checking for any_errors_fatal 30582 1726855341.51213: checking for max_fail_percentage 30582 1726855341.51216: done checking for max_fail_percentage 30582 1726855341.51217: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.51217: done checking to see if all hosts have failed 30582 1726855341.51218: getting the remaining hosts for this loop 30582 1726855341.51220: done getting the remaining hosts for this loop 30582 1726855341.51225: getting the next task for host managed_node3 30582 1726855341.51233: done getting next task for host managed_node3 30582 1726855341.51239: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855341.51245: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.51272: getting variables 30582 1726855341.51275: in VariableManager get_vars() 30582 1726855341.51507: Calling all_inventory to load vars for managed_node3 30582 1726855341.51510: Calling groups_inventory to load vars for managed_node3 30582 1726855341.51513: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.51525: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.51528: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.51531: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.53251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.55120: done with get_vars() 30582 1726855341.55144: done getting variables 30582 1726855341.55209: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:21 -0400 (0:00:00.059) 0:01:17.902 ****** 30582 1726855341.55246: entering _queue_task() for managed_node3/fail 30582 1726855341.55892: worker is 1 (out of 1 available) 30582 1726855341.55909: exiting _queue_task() for managed_node3/fail 30582 1726855341.55919: done queuing things up, now waiting for results queue to drain 30582 1726855341.55921: waiting for pending results... 30582 1726855341.56278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855341.56342: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183e 30582 1726855341.56362: variable 'ansible_search_path' from source: unknown 30582 1726855341.56451: variable 'ansible_search_path' from source: unknown 30582 1726855341.56454: calling self._execute() 30582 1726855341.56715: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.56728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.56743: variable 'omit' from source: magic vars 30582 1726855341.57525: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.57662: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.57877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855341.60544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855341.60719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855341.60936: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855341.60939: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855341.60942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855341.61161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.61212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.61298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.61401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.61425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.61550: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.61574: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855341.61706: variable 'ansible_distribution' from source: facts 30582 1726855341.61717: variable '__network_rh_distros' from source: role '' defaults 30582 1726855341.61731: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855341.62001: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.62038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.62068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.62132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.62141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.62195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.62240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.62259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.62350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855341.62353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.62367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.62398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.62425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.62475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.62496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.62826: variable 'network_connections' from source: include params 30582 1726855341.62836: variable 'interface' from source: play vars 30582 1726855341.62886: variable 'interface' from source: play vars 30582 1726855341.62901: variable 'network_state' from source: role '' defaults 30582 1726855341.62948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855341.63066: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855341.63097: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855341.63122: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855341.63145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855341.63179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855341.63196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855341.63221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.63237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855341.63265: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855341.63271: when evaluation is False, skipping this task 30582 1726855341.63274: _execute() done 30582 1726855341.63278: dumping result to json 30582 1726855341.63280: done dumping result, returning 30582 1726855341.63283: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-00000000183e] 30582 1726855341.63290: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000183e 30582 1726855341.63380: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183e 30582 1726855341.63382: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855341.63433: no more pending results, returning what we have 30582 1726855341.63436: results queue empty 30582 1726855341.63437: checking for any_errors_fatal 30582 1726855341.63444: done checking for any_errors_fatal 30582 1726855341.63445: checking for max_fail_percentage 30582 1726855341.63447: done checking for max_fail_percentage 30582 1726855341.63447: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.63448: done checking to see if all hosts have failed 30582 1726855341.63449: getting the remaining hosts for this loop 30582 1726855341.63451: done getting the remaining hosts for this loop 30582 1726855341.63455: getting the next task for host managed_node3 30582 1726855341.63463: done getting next task for host managed_node3 30582 1726855341.63469: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855341.63474: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.63504: getting variables 30582 1726855341.63506: in VariableManager get_vars() 30582 1726855341.63548: Calling all_inventory to load vars for managed_node3 30582 1726855341.63551: Calling groups_inventory to load vars for managed_node3 30582 1726855341.63553: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.63563: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.63565: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.63570: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.64394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.65794: done with get_vars() 30582 1726855341.65817: done getting variables 30582 1726855341.65868: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:21 -0400 (0:00:00.106) 0:01:18.008 ****** 30582 1726855341.65899: entering _queue_task() for managed_node3/dnf 30582 1726855341.66169: worker is 1 (out of 1 available) 30582 1726855341.66183: exiting _queue_task() for managed_node3/dnf 30582 1726855341.66197: done queuing things up, now waiting for results queue to drain 30582 1726855341.66199: waiting for pending results... 30582 1726855341.66382: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855341.66483: in run() - task 0affcc66-ac2b-aa83-7d57-00000000183f 30582 1726855341.66496: variable 'ansible_search_path' from source: unknown 30582 1726855341.66499: variable 'ansible_search_path' from source: unknown 30582 1726855341.66531: calling self._execute() 30582 1726855341.66603: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.66607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.66616: variable 'omit' from source: magic vars 30582 1726855341.66898: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.66908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.67045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855341.69099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855341.69146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855341.69186: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855341.69218: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855341.69237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855341.69298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.69330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.69350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.69378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.69390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.69478: variable 'ansible_distribution' from source: facts 30582 1726855341.69482: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.69496: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855341.69577: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855341.69663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.69680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.69699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.69724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.69735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.69762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.69783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.69801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.69824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.69834: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.69863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.69882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.69901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.69924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.69934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.70040: variable 'network_connections' from source: include params 30582 1726855341.70049: variable 'interface' from source: play vars 30582 1726855341.70097: variable 'interface' from source: play vars 30582 1726855341.70144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855341.70254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855341.70282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855341.70309: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855341.70330: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855341.70359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855341.70376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855341.70399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.70417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855341.70461: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855341.71098: variable 'network_connections' from source: include params 30582 1726855341.71102: variable 'interface' from source: play vars 30582 1726855341.71104: variable 'interface' from source: play vars 30582 1726855341.71106: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855341.71108: when evaluation is False, skipping this task 30582 1726855341.71110: _execute() done 30582 1726855341.71112: dumping result to json 30582 1726855341.71114: done dumping result, returning 30582 1726855341.71124: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000183f] 30582 
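The skips in this stretch of the log hinge on three distribution-version conditionals: `ansible_distribution_major_version != '6'` (a string comparison), `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` (the DNF path), and `ansible_distribution_major_version | int < 8` (the YUM path, skipped here). Re-stated as a minimal Python sketch, with hypothetical host facts:

```python
def version_gates(distribution, major_version):
    # Mirrors the three conditionals seen in the log; "| int" in Jinja2
    # corresponds to int() here, and the first check really is a string
    # comparison against '6'.
    not_el6 = major_version != "6"
    use_dnf = distribution == "Fedora" or int(major_version) > 7
    use_yum = int(major_version) < 8
    return not_el6, use_dnf, use_yum

# Hypothetical EL9-style host: the DNF check runs, the YUM check is skipped,
# matching the (True, ..., False) evaluations recorded above and below.
print(version_gates("CentOS", "9"))  # (True, True, False)
```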
1726855341.71128: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183f 30582 1726855341.71237: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000183f 30582 1726855341.71240: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855341.71291: no more pending results, returning what we have 30582 1726855341.71294: results queue empty 30582 1726855341.71295: checking for any_errors_fatal 30582 1726855341.71300: done checking for any_errors_fatal 30582 1726855341.71301: checking for max_fail_percentage 30582 1726855341.71302: done checking for max_fail_percentage 30582 1726855341.71303: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.71489: done checking to see if all hosts have failed 30582 1726855341.71491: getting the remaining hosts for this loop 30582 1726855341.71492: done getting the remaining hosts for this loop 30582 1726855341.71497: getting the next task for host managed_node3 30582 1726855341.71504: done getting next task for host managed_node3 30582 1726855341.71508: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855341.71513: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.71532: getting variables 30582 1726855341.71535: in VariableManager get_vars() 30582 1726855341.71571: Calling all_inventory to load vars for managed_node3 30582 1726855341.71574: Calling groups_inventory to load vars for managed_node3 30582 1726855341.71576: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.71584: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.71593: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.71597: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.73065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.73968: done with get_vars() 30582 1726855341.73989: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855341.74045: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:21 -0400 (0:00:00.081) 0:01:18.090 ****** 30582 1726855341.74073: entering _queue_task() for managed_node3/yum 30582 1726855341.74339: worker is 1 (out of 1 available) 30582 1726855341.74353: exiting _queue_task() for managed_node3/yum 30582 1726855341.74364: done queuing things up, now waiting for results queue to drain 30582 1726855341.74365: waiting for pending results... 30582 1726855341.74549: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855341.74655: in run() - task 0affcc66-ac2b-aa83-7d57-000000001840 30582 1726855341.74665: variable 'ansible_search_path' from source: unknown 30582 1726855341.74671: variable 'ansible_search_path' from source: unknown 30582 1726855341.74699: calling self._execute() 30582 1726855341.74773: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.74777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.74783: variable 'omit' from source: magic vars 30582 1726855341.75197: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.75201: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.75364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855341.77104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855341.77151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855341.77179: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855341.77207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855341.77227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855341.77285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.77316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.77335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.77362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.77374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.77441: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.77453: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855341.77457: when evaluation is False, skipping this task 30582 1726855341.77460: _execute() done 30582 1726855341.77462: dumping result to json 30582 1726855341.77469: done dumping result, returning 30582 1726855341.77474: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001840] 30582 1726855341.77476: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001840 30582 1726855341.77573: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001840 30582 1726855341.77576: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855341.77635: no more pending results, returning what we have 30582 1726855341.77639: results queue empty 30582 1726855341.77641: checking for any_errors_fatal 30582 1726855341.77647: done checking for any_errors_fatal 30582 1726855341.77648: checking for max_fail_percentage 30582 1726855341.77650: done checking for max_fail_percentage 30582 1726855341.77651: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.77651: done checking to see if all hosts have failed 30582 1726855341.77652: getting the remaining hosts for this loop 30582 1726855341.77653: done getting the remaining hosts for this loop 30582 1726855341.77657: getting the next task for host managed_node3 30582 1726855341.77665: done getting next task for host managed_node3 30582 1726855341.77671: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855341.77675: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.77701: getting variables 30582 1726855341.77702: in VariableManager get_vars() 30582 1726855341.77742: Calling all_inventory to load vars for managed_node3 30582 1726855341.77745: Calling groups_inventory to load vars for managed_node3 30582 1726855341.77747: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.77756: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.77759: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.77761: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.78618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.79505: done with get_vars() 30582 1726855341.79523: done getting variables 30582 1726855341.79590: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:21 -0400 (0:00:00.055) 0:01:18.146 ****** 30582 1726855341.79623: entering _queue_task() for managed_node3/fail 30582 1726855341.79952: worker is 1 (out of 1 available) 30582 1726855341.79966: exiting _queue_task() for managed_node3/fail 30582 1726855341.79982: done queuing things up, now waiting for results queue to drain 30582 1726855341.79984: waiting for pending results... 30582 1726855341.80394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855341.80455: in run() - task 0affcc66-ac2b-aa83-7d57-000000001841 30582 1726855341.80479: variable 'ansible_search_path' from source: unknown 30582 1726855341.80492: variable 'ansible_search_path' from source: unknown 30582 1726855341.80546: calling self._execute() 30582 1726855341.80653: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.80671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.80690: variable 'omit' from source: magic vars 30582 1726855341.80993: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.81002: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.81091: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855341.81219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855341.83185: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855341.83237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855341.83267: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855341.83307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855341.83331: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855341.83494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.83498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.83501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.83510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.83513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.83592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.83595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.83598: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.83628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.83640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.83677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.83813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.83817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.83819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.83822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.83929: variable 'network_connections' from source: include params 30582 1726855341.83945: variable 'interface' from source: play vars 30582 1726855341.84015: variable 'interface' from source: play vars 30582 1726855341.84092: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855341.84380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855341.84385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855341.84389: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855341.84391: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855341.84401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855341.84422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855341.84492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.84495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855341.84530: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855341.84766: variable 'network_connections' from source: include params 30582 1726855341.84773: variable 'interface' from source: play vars 30582 1726855341.84833: variable 'interface' from source: play vars 30582 1726855341.84863: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855341.84867: when evaluation is False, skipping this task 30582 
1726855341.84869: _execute() done 30582 1726855341.84891: dumping result to json 30582 1726855341.84894: done dumping result, returning 30582 1726855341.84896: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001841] 30582 1726855341.84898: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001841 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855341.85074: no more pending results, returning what we have 30582 1726855341.85078: results queue empty 30582 1726855341.85079: checking for any_errors_fatal 30582 1726855341.85085: done checking for any_errors_fatal 30582 1726855341.85085: checking for max_fail_percentage 30582 1726855341.85089: done checking for max_fail_percentage 30582 1726855341.85090: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.85091: done checking to see if all hosts have failed 30582 1726855341.85091: getting the remaining hosts for this loop 30582 1726855341.85093: done getting the remaining hosts for this loop 30582 1726855341.85097: getting the next task for host managed_node3 30582 1726855341.85104: done getting next task for host managed_node3 30582 1726855341.85108: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855341.85113: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855341.85138: getting variables 30582 1726855341.85140: in VariableManager get_vars() 30582 1726855341.85179: Calling all_inventory to load vars for managed_node3 30582 1726855341.85182: Calling groups_inventory to load vars for managed_node3 30582 1726855341.85184: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.85250: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.85254: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.85259: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001841 30582 1726855341.85261: WORKER PROCESS EXITING 30582 1726855341.85264: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.86807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.88375: done with get_vars() 30582 1726855341.88414: done getting variables 30582 1726855341.88482: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:21 -0400 (0:00:00.088) 0:01:18.235 ****** 30582 1726855341.88527: entering _queue_task() for managed_node3/package 30582 1726855341.88918: worker is 1 (out of 1 available) 30582 1726855341.88930: exiting _queue_task() for managed_node3/package 30582 1726855341.88942: done queuing things up, now waiting for results queue to drain 30582 1726855341.88944: waiting for pending results... 30582 1726855341.89250: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855341.89390: in run() - task 0affcc66-ac2b-aa83-7d57-000000001842 30582 1726855341.89405: variable 'ansible_search_path' from source: unknown 30582 1726855341.89408: variable 'ansible_search_path' from source: unknown 30582 1726855341.89486: calling self._execute() 30582 1726855341.89554: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855341.89558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855341.89567: variable 'omit' from source: magic vars 30582 1726855341.90030: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.90033: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855341.90195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855341.90474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855341.90524: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855341.90558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855341.90638: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855341.90755: variable 'network_packages' from source: role '' defaults 30582 1726855341.90894: variable '__network_provider_setup' from source: role '' defaults 30582 1726855341.90902: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855341.90952: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855341.91003: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855341.91020: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855341.91210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855341.92723: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855341.92769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855341.92801: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855341.92826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855341.92848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855341.92910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.92931: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.92948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.92980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.92993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.93025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.93040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.93057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.93089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.93100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855341.93258: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855341.93422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.93425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.93427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.93448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.93459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.93558: variable 'ansible_python' from source: facts 30582 1726855341.93609: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855341.93799: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855341.93802: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855341.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.93874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.93900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.93933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.94020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.94023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855341.94034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855341.94037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.94072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855341.94089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855341.94233: variable 'network_connections' from source: include params 
30582 1726855341.94240: variable 'interface' from source: play vars 30582 1726855341.94332: variable 'interface' from source: play vars 30582 1726855341.94428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855341.94459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855341.94486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855341.94517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855341.94567: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855341.94799: variable 'network_connections' from source: include params 30582 1726855341.94802: variable 'interface' from source: play vars 30582 1726855341.94879: variable 'interface' from source: play vars 30582 1726855341.94922: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855341.94977: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855341.95179: variable 'network_connections' from source: include params 30582 1726855341.95183: variable 'interface' from source: play vars 30582 1726855341.95231: variable 'interface' from source: play vars 30582 1726855341.95249: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855341.95309: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855341.95512: variable 'network_connections' 
from source: include params 30582 1726855341.95515: variable 'interface' from source: play vars 30582 1726855341.95562: variable 'interface' from source: play vars 30582 1726855341.95610: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855341.95653: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855341.95659: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855341.95704: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855341.95839: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855341.96138: variable 'network_connections' from source: include params 30582 1726855341.96142: variable 'interface' from source: play vars 30582 1726855341.96186: variable 'interface' from source: play vars 30582 1726855341.96197: variable 'ansible_distribution' from source: facts 30582 1726855341.96199: variable '__network_rh_distros' from source: role '' defaults 30582 1726855341.96202: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.96226: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855341.96336: variable 'ansible_distribution' from source: facts 30582 1726855341.96339: variable '__network_rh_distros' from source: role '' defaults 30582 1726855341.96344: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.96352: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855341.96462: variable 'ansible_distribution' from source: facts 30582 1726855341.96465: variable '__network_rh_distros' from source: role '' defaults 30582 1726855341.96473: variable 'ansible_distribution_major_version' from source: facts 30582 1726855341.96501: variable 'network_provider' from source: set_fact 30582 
1726855341.96512: variable 'ansible_facts' from source: unknown 30582 1726855341.97151: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855341.97154: when evaluation is False, skipping this task 30582 1726855341.97157: _execute() done 30582 1726855341.97159: dumping result to json 30582 1726855341.97161: done dumping result, returning 30582 1726855341.97227: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000001842] 30582 1726855341.97229: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001842 30582 1726855341.97301: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001842 30582 1726855341.97304: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855341.97380: no more pending results, returning what we have 30582 1726855341.97384: results queue empty 30582 1726855341.97385: checking for any_errors_fatal 30582 1726855341.97394: done checking for any_errors_fatal 30582 1726855341.97395: checking for max_fail_percentage 30582 1726855341.97397: done checking for max_fail_percentage 30582 1726855341.97398: checking to see if all hosts have failed and the running result is not ok 30582 1726855341.97399: done checking to see if all hosts have failed 30582 1726855341.97399: getting the remaining hosts for this loop 30582 1726855341.97401: done getting the remaining hosts for this loop 30582 1726855341.97405: getting the next task for host managed_node3 30582 1726855341.97412: done getting next task for host managed_node3 30582 1726855341.97416: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855341.97420: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855341.97442: getting variables 30582 1726855341.97444: in VariableManager get_vars() 30582 1726855341.97533: Calling all_inventory to load vars for managed_node3 30582 1726855341.97536: Calling groups_inventory to load vars for managed_node3 30582 1726855341.97538: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855341.97547: Calling all_plugins_play to load vars for managed_node3 30582 1726855341.97550: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855341.97552: Calling groups_plugins_play to load vars for managed_node3 30582 1726855341.98826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855341.99706: done with get_vars() 30582 1726855341.99725: done getting variables 30582 1726855341.99769: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:21 -0400 (0:00:00.112) 0:01:18.347 ****** 30582 1726855341.99798: entering _queue_task() for managed_node3/package 30582 1726855342.00064: worker is 1 (out of 1 available) 30582 1726855342.00079: exiting _queue_task() for managed_node3/package 30582 1726855342.00092: done queuing things up, now waiting for results queue to drain 30582 1726855342.00094: waiting for pending results... 
30582 1726855342.00378: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855342.00463: in run() - task 0affcc66-ac2b-aa83-7d57-000000001843 30582 1726855342.00482: variable 'ansible_search_path' from source: unknown 30582 1726855342.00693: variable 'ansible_search_path' from source: unknown 30582 1726855342.00696: calling self._execute() 30582 1726855342.00699: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.00701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.00705: variable 'omit' from source: magic vars 30582 1726855342.01031: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.01055: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.01255: variable 'network_state' from source: role '' defaults 30582 1726855342.01258: Evaluated conditional (network_state != {}): False 30582 1726855342.01260: when evaluation is False, skipping this task 30582 1726855342.01261: _execute() done 30582 1726855342.01264: dumping result to json 30582 1726855342.01266: done dumping result, returning 30582 1726855342.01269: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001843] 30582 1726855342.01271: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001843 30582 1726855342.01342: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001843 30582 1726855342.01346: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855342.01409: no more pending results, returning what we have 30582 1726855342.01413: results queue empty 30582 1726855342.01415: checking 
for any_errors_fatal 30582 1726855342.01423: done checking for any_errors_fatal 30582 1726855342.01424: checking for max_fail_percentage 30582 1726855342.01426: done checking for max_fail_percentage 30582 1726855342.01427: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.01428: done checking to see if all hosts have failed 30582 1726855342.01429: getting the remaining hosts for this loop 30582 1726855342.01431: done getting the remaining hosts for this loop 30582 1726855342.01434: getting the next task for host managed_node3 30582 1726855342.01443: done getting next task for host managed_node3 30582 1726855342.01447: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855342.01454: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855342.01481: getting variables 30582 1726855342.01484: in VariableManager get_vars() 30582 1726855342.01532: Calling all_inventory to load vars for managed_node3 30582 1726855342.01535: Calling groups_inventory to load vars for managed_node3 30582 1726855342.01538: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.01550: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.01553: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.01557: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.02855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.03730: done with get_vars() 30582 1726855342.03748: done getting variables 30582 1726855342.03796: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:22 -0400 (0:00:00.040) 0:01:18.388 ****** 30582 1726855342.03821: entering _queue_task() for managed_node3/package 30582 1726855342.04086: worker is 1 (out of 1 available) 30582 1726855342.04290: exiting _queue_task() for managed_node3/package 30582 1726855342.04302: done queuing things up, now waiting for results queue to drain 30582 1726855342.04304: waiting for pending results... 
30582 1726855342.04505: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855342.04576: in run() - task 0affcc66-ac2b-aa83-7d57-000000001844 30582 1726855342.04600: variable 'ansible_search_path' from source: unknown 30582 1726855342.04625: variable 'ansible_search_path' from source: unknown 30582 1726855342.04656: calling self._execute() 30582 1726855342.04842: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.04846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.04849: variable 'omit' from source: magic vars 30582 1726855342.05201: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.05211: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.05296: variable 'network_state' from source: role '' defaults 30582 1726855342.05305: Evaluated conditional (network_state != {}): False 30582 1726855342.05308: when evaluation is False, skipping this task 30582 1726855342.05311: _execute() done 30582 1726855342.05314: dumping result to json 30582 1726855342.05316: done dumping result, returning 30582 1726855342.05324: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001844] 30582 1726855342.05329: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001844 30582 1726855342.05428: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001844 30582 1726855342.05431: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855342.05493: no more pending results, returning what we have 30582 1726855342.05496: results queue empty 30582 1726855342.05497: checking for 
any_errors_fatal 30582 1726855342.05504: done checking for any_errors_fatal 30582 1726855342.05505: checking for max_fail_percentage 30582 1726855342.05507: done checking for max_fail_percentage 30582 1726855342.05508: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.05508: done checking to see if all hosts have failed 30582 1726855342.05509: getting the remaining hosts for this loop 30582 1726855342.05510: done getting the remaining hosts for this loop 30582 1726855342.05514: getting the next task for host managed_node3 30582 1726855342.05523: done getting next task for host managed_node3 30582 1726855342.05526: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855342.05532: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855342.05562: getting variables 30582 1726855342.05563: in VariableManager get_vars() 30582 1726855342.05607: Calling all_inventory to load vars for managed_node3 30582 1726855342.05610: Calling groups_inventory to load vars for managed_node3 30582 1726855342.05612: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.05621: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.05624: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.05626: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.06442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.11735: done with get_vars() 30582 1726855342.11761: done getting variables 30582 1726855342.11804: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:22 -0400 (0:00:00.080) 0:01:18.468 ****** 30582 1726855342.11827: entering _queue_task() for managed_node3/service 30582 1726855342.12115: worker is 1 (out of 1 available) 30582 1726855342.12129: exiting _queue_task() for managed_node3/service 30582 1726855342.12141: done queuing things up, now waiting for results queue to drain 30582 1726855342.12143: waiting for pending results... 
30582 1726855342.12327: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855342.12436: in run() - task 0affcc66-ac2b-aa83-7d57-000000001845 30582 1726855342.12448: variable 'ansible_search_path' from source: unknown 30582 1726855342.12453: variable 'ansible_search_path' from source: unknown 30582 1726855342.12485: calling self._execute() 30582 1726855342.12551: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.12555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.12564: variable 'omit' from source: magic vars 30582 1726855342.12852: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.12861: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.12958: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855342.13090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855342.14609: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855342.14664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855342.14702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855342.14729: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855342.14750: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855342.14817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855342.14837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.14854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.14886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.14900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.14933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.14949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.14965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.14997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.15007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.15035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.15051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.15068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.15098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.15110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.15228: variable 'network_connections' from source: include params 30582 1726855342.15239: variable 'interface' from source: play vars 30582 1726855342.15294: variable 'interface' from source: play vars 30582 1726855342.15346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855342.15465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855342.15505: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855342.15530: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855342.15550: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855342.15582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855342.15599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855342.15616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.15636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855342.15686: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855342.15845: variable 'network_connections' from source: include params 30582 1726855342.15850: variable 'interface' from source: play vars 30582 1726855342.15901: variable 'interface' from source: play vars 30582 1726855342.15926: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855342.15929: when evaluation is False, skipping this task 30582 1726855342.15932: _execute() done 30582 1726855342.15934: dumping result to json 30582 1726855342.15936: done dumping result, returning 30582 1726855342.15944: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001845] 30582 1726855342.15949: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001845 30582 1726855342.16048: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000001845 30582 1726855342.16058: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855342.16116: no more pending results, returning what we have 30582 1726855342.16120: results queue empty 30582 1726855342.16121: checking for any_errors_fatal 30582 1726855342.16129: done checking for any_errors_fatal 30582 1726855342.16129: checking for max_fail_percentage 30582 1726855342.16132: done checking for max_fail_percentage 30582 1726855342.16133: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.16133: done checking to see if all hosts have failed 30582 1726855342.16134: getting the remaining hosts for this loop 30582 1726855342.16135: done getting the remaining hosts for this loop 30582 1726855342.16139: getting the next task for host managed_node3 30582 1726855342.16146: done getting next task for host managed_node3 30582 1726855342.16150: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855342.16155: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855342.16178: getting variables 30582 1726855342.16179: in VariableManager get_vars() 30582 1726855342.16232: Calling all_inventory to load vars for managed_node3 30582 1726855342.16235: Calling groups_inventory to load vars for managed_node3 30582 1726855342.16237: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.16246: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.16249: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.16251: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.17099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.18011: done with get_vars() 30582 1726855342.18030: done getting variables 30582 1726855342.18078: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:22 -0400 (0:00:00.062) 0:01:18.530 ****** 30582 1726855342.18105: entering _queue_task() for managed_node3/service 30582 1726855342.18370: worker is 1 (out of 1 available) 30582 1726855342.18385: exiting _queue_task() for managed_node3/service 30582 1726855342.18401: done 
queuing things up, now waiting for results queue to drain 30582 1726855342.18403: waiting for pending results... 30582 1726855342.18600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855342.18704: in run() - task 0affcc66-ac2b-aa83-7d57-000000001846 30582 1726855342.18715: variable 'ansible_search_path' from source: unknown 30582 1726855342.18718: variable 'ansible_search_path' from source: unknown 30582 1726855342.18751: calling self._execute() 30582 1726855342.18825: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.18828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.18837: variable 'omit' from source: magic vars 30582 1726855342.19135: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.19143: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.19261: variable 'network_provider' from source: set_fact 30582 1726855342.19265: variable 'network_state' from source: role '' defaults 30582 1726855342.19278: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855342.19281: variable 'omit' from source: magic vars 30582 1726855342.19328: variable 'omit' from source: magic vars 30582 1726855342.19347: variable 'network_service_name' from source: role '' defaults 30582 1726855342.19399: variable 'network_service_name' from source: role '' defaults 30582 1726855342.19470: variable '__network_provider_setup' from source: role '' defaults 30582 1726855342.19477: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855342.19526: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855342.19533: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855342.19579: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855342.19731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855342.21215: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855342.21268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855342.21300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855342.21326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855342.21347: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855342.21492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.21496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.21499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.21536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.21555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.21608: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.21636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.21664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.21719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.21741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.21981: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855342.22098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.22292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.22296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.22298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.22301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.22303: variable 'ansible_python' from source: facts 30582 1726855342.22321: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855342.22406: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855342.22486: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855342.22594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.22612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.22636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.22659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.22672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.22710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.22730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.22751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.22777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.22789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.22884: variable 'network_connections' from source: include params 30582 1726855342.22893: variable 'interface' from source: play vars 30582 1726855342.22944: variable 'interface' from source: play vars 30582 1726855342.23045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855342.23896: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855342.23919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855342.23972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855342.24025: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855342.24104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855342.24140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855342.24194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.24229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855342.24409: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855342.24910: variable 'network_connections' from source: include params 30582 1726855342.24924: variable 'interface' from source: play vars 30582 1726855342.25299: variable 'interface' from source: play vars 30582 1726855342.25302: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855342.25592: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855342.25782: variable 'network_connections' from source: include params 30582 1726855342.26000: variable 'interface' from source: play vars 30582 1726855342.26073: variable 'interface' from source: play vars 30582 1726855342.26393: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855342.26401: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855342.26878: variable 'network_connections' from source: include params 30582 1726855342.26944: variable 'interface' from source: play vars 30582 1726855342.27084: variable 'interface' from source: play vars 30582 1726855342.27222: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855342.27452: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855342.27470: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855342.27536: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855342.27965: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855342.29195: variable 'network_connections' from source: include params 30582 1726855342.29201: variable 'interface' from source: play vars 30582 1726855342.29203: variable 'interface' from source: play vars 30582 1726855342.29207: variable 'ansible_distribution' from source: facts 30582 1726855342.29209: variable '__network_rh_distros' from source: role '' defaults 30582 1726855342.29211: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.29213: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855342.29495: variable 'ansible_distribution' from source: facts 30582 1726855342.29550: variable '__network_rh_distros' from source: role '' defaults 30582 1726855342.29562: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.29581: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855342.29750: variable 'ansible_distribution' from source: facts 30582 1726855342.29764: variable '__network_rh_distros' from source: role '' defaults 30582 1726855342.29778: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.29824: variable 'network_provider' from source: set_fact 30582 1726855342.29894: variable 'omit' from source: magic vars 30582 1726855342.29897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855342.29927: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855342.29949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855342.29975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855342.29998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855342.30032: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855342.30092: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.30096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.30164: Set connection var ansible_timeout to 10 30582 1726855342.30176: Set connection var ansible_connection to ssh 30582 1726855342.30192: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855342.30209: Set connection var ansible_pipelining to False 30582 1726855342.30219: Set connection var ansible_shell_executable to /bin/sh 30582 1726855342.30226: Set connection var ansible_shell_type to sh 30582 1726855342.30255: variable 'ansible_shell_executable' from source: unknown 30582 1726855342.30264: variable 'ansible_connection' from source: unknown 30582 1726855342.30310: variable 'ansible_module_compression' from source: unknown 30582 1726855342.30313: variable 'ansible_shell_type' from source: unknown 30582 1726855342.30315: variable 'ansible_shell_executable' from source: unknown 30582 1726855342.30317: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.30319: variable 'ansible_pipelining' from source: unknown 30582 1726855342.30321: variable 'ansible_timeout' from source: unknown 30582 1726855342.30323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855342.30433: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855342.30526: variable 'omit' from source: magic vars 30582 1726855342.30529: starting attempt loop 30582 1726855342.30532: running the handler 30582 1726855342.30553: variable 'ansible_facts' from source: unknown 30582 1726855342.31319: _low_level_execute_command(): starting 30582 1726855342.31331: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855342.32104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855342.32163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855342.32185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.32210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 
1726855342.32305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.34030: stdout chunk (state=3): >>>/root <<< 30582 1726855342.34200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.34204: stdout chunk (state=3): >>><<< 30582 1726855342.34207: stderr chunk (state=3): >>><<< 30582 1726855342.34226: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855342.34330: _low_level_execute_command(): starting 30582 1726855342.34334: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661 `" && echo ansible-tmp-1726855342.3423736-34220-201086851863661="` echo 
/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661 `" ) && sleep 0' 30582 1726855342.34908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855342.34923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855342.34945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855342.34964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855342.35002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855342.35019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855342.35055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855342.35119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855342.35139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.35171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855342.35274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.37194: stdout chunk (state=3): >>>ansible-tmp-1726855342.3423736-34220-201086851863661=/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661 <<< 
30582 1726855342.37355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.37359: stdout chunk (state=3): >>><<< 30582 1726855342.37361: stderr chunk (state=3): >>><<< 30582 1726855342.37377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855342.3423736-34220-201086851863661=/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855342.37415: variable 'ansible_module_compression' from source: unknown 30582 1726855342.37492: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855342.37547: variable 'ansible_facts' from source: unknown 30582 1726855342.37862: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py 30582 1726855342.38121: Sending initial data 30582 1726855342.38125: Sent initial data (156 bytes) 30582 1726855342.38504: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855342.38517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855342.38530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855342.38552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855342.38653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.38671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855342.38763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.40356: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855342.40454: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855342.40624: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855342.40681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp2x3adoru /root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py <<< 30582 1726855342.40699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py" <<< 30582 1726855342.40781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp2x3adoru" to remote "/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py" <<< 30582 1726855342.42474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.42492: stderr chunk (state=3): >>><<< 30582 1726855342.42602: stdout chunk (state=3): >>><<< 30582 1726855342.42606: done transferring module to remote 30582 1726855342.42608: _low_level_execute_command(): starting 30582 1726855342.42611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/ 
/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py && sleep 0' 30582 1726855342.43192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855342.43206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855342.43242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855342.43343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.43367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855342.43492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.45371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.45399: stdout chunk (state=3): >>><<< 30582 1726855342.45402: stderr chunk (state=3): >>><<< 30582 1726855342.45496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855342.45499: _low_level_execute_command(): starting 30582 1726855342.45502: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/AnsiballZ_systemd.py && sleep 0' 30582 1726855342.46066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855342.46172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855342.46190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855342.46205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.46225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855342.46320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.75409: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 
2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10616832", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320451072", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2191095000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", 
"StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855342.75446: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855342.77369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.77425: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855342.77429: stdout chunk (state=3): >>><<< 30582 1726855342.77435: stderr chunk (state=3): >>><<< 30582 1726855342.77477: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10616832", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320451072", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2191095000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855342.77798: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855342.77802: _low_level_execute_command(): starting 30582 1726855342.77805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855342.3423736-34220-201086851863661/ > /dev/null 2>&1 && sleep 0' 30582 1726855342.78311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855342.78321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855342.78333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855342.78392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855342.78396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855342.78398: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855342.78401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855342.78403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855342.78405: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855342.78407: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855342.78412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855342.78422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855342.78434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855342.78441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855342.78452: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855342.78455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855342.78564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855342.78567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855342.78570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855342.78642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855342.80796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855342.80800: stdout chunk (state=3): >>><<< 30582 1726855342.80802: stderr chunk (state=3): >>><<< 30582 1726855342.80805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855342.80807: handler run complete 30582 1726855342.80874: attempt loop complete, returning result 30582 1726855342.80877: _execute() done 30582 1726855342.80880: dumping result to json 30582 1726855342.81031: done dumping result, returning 30582 1726855342.81035: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000001846] 30582 1726855342.81038: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001846 30582 1726855342.81544: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001846 30582 1726855342.81548: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855342.81612: no more pending results, returning what we have 30582 1726855342.81618: results queue empty 30582 1726855342.81619: checking for any_errors_fatal 30582 1726855342.81623: done checking for any_errors_fatal 30582 1726855342.81624: checking for max_fail_percentage 30582 1726855342.81626: done checking for max_fail_percentage 30582 1726855342.81627: checking to see if all hosts have failed and 
the running result is not ok 30582 1726855342.81627: done checking to see if all hosts have failed 30582 1726855342.81628: getting the remaining hosts for this loop 30582 1726855342.81629: done getting the remaining hosts for this loop 30582 1726855342.81633: getting the next task for host managed_node3 30582 1726855342.81640: done getting next task for host managed_node3 30582 1726855342.81643: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855342.81649: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855342.81670: getting variables 30582 1726855342.81672: in VariableManager get_vars() 30582 1726855342.81711: Calling all_inventory to load vars for managed_node3 30582 1726855342.81713: Calling groups_inventory to load vars for managed_node3 30582 1726855342.81715: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.81726: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.81729: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.81738: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.83522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.86044: done with get_vars() 30582 1726855342.86082: done getting variables 30582 1726855342.86260: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:22 -0400 (0:00:00.682) 0:01:19.213 ****** 30582 1726855342.86360: entering _queue_task() for managed_node3/service 30582 1726855342.87171: worker is 1 (out of 1 available) 30582 1726855342.87185: exiting _queue_task() for managed_node3/service 30582 1726855342.87200: done queuing things up, now waiting for results queue to drain 30582 1726855342.87206: waiting for pending results... 
30582 1726855342.87805: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855342.87812: in run() - task 0affcc66-ac2b-aa83-7d57-000000001847 30582 1726855342.87816: variable 'ansible_search_path' from source: unknown 30582 1726855342.87818: variable 'ansible_search_path' from source: unknown 30582 1726855342.87821: calling self._execute() 30582 1726855342.87824: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.87826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.87829: variable 'omit' from source: magic vars 30582 1726855342.88220: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.88240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.88380: variable 'network_provider' from source: set_fact 30582 1726855342.88385: Evaluated conditional (network_provider == "nm"): True 30582 1726855342.88491: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855342.88584: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855342.88758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855342.90720: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855342.90769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855342.90797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855342.90826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855342.90848: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855342.91093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.91097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.91099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.91102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.91105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.91108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.91110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.91112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.91144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.91156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.91194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855342.91216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855342.91235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.91259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855342.91272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855342.91377: variable 'network_connections' from source: include params 30582 1726855342.91390: variable 'interface' from source: play vars 30582 1726855342.91447: variable 'interface' from source: play vars 30582 1726855342.91499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855342.91616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855342.91644: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855342.91671: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855342.91692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855342.91723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855342.91739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855342.91756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855342.91775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855342.91817: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855342.92097: variable 'network_connections' from source: include params 30582 1726855342.92101: variable 'interface' from source: play vars 30582 1726855342.92103: variable 'interface' from source: play vars 30582 1726855342.92123: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855342.92126: when evaluation is False, skipping this task 30582 1726855342.92128: _execute() done 30582 1726855342.92131: dumping result to json 30582 1726855342.92133: done dumping result, returning 30582 1726855342.92143: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000001847] 30582 
1726855342.92154: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001847 30582 1726855342.92393: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001847 30582 1726855342.92396: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855342.92442: no more pending results, returning what we have 30582 1726855342.92446: results queue empty 30582 1726855342.92447: checking for any_errors_fatal 30582 1726855342.92464: done checking for any_errors_fatal 30582 1726855342.92465: checking for max_fail_percentage 30582 1726855342.92469: done checking for max_fail_percentage 30582 1726855342.92470: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.92470: done checking to see if all hosts have failed 30582 1726855342.92471: getting the remaining hosts for this loop 30582 1726855342.92472: done getting the remaining hosts for this loop 30582 1726855342.92476: getting the next task for host managed_node3 30582 1726855342.92483: done getting next task for host managed_node3 30582 1726855342.92489: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855342.92494: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855342.92516: getting variables 30582 1726855342.92518: in VariableManager get_vars() 30582 1726855342.92553: Calling all_inventory to load vars for managed_node3 30582 1726855342.92555: Calling groups_inventory to load vars for managed_node3 30582 1726855342.92557: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.92568: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.92571: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.92574: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.93453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.94335: done with get_vars() 30582 1726855342.94357: done getting variables 30582 1726855342.94411: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:22 -0400 (0:00:00.080) 0:01:19.294 
****** 30582 1726855342.94436: entering _queue_task() for managed_node3/service 30582 1726855342.94702: worker is 1 (out of 1 available) 30582 1726855342.94718: exiting _queue_task() for managed_node3/service 30582 1726855342.94731: done queuing things up, now waiting for results queue to drain 30582 1726855342.94733: waiting for pending results... 30582 1726855342.94924: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855342.95037: in run() - task 0affcc66-ac2b-aa83-7d57-000000001848 30582 1726855342.95049: variable 'ansible_search_path' from source: unknown 30582 1726855342.95052: variable 'ansible_search_path' from source: unknown 30582 1726855342.95086: calling self._execute() 30582 1726855342.95155: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.95158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.95167: variable 'omit' from source: magic vars 30582 1726855342.95458: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.95468: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.95552: variable 'network_provider' from source: set_fact 30582 1726855342.95557: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855342.95560: when evaluation is False, skipping this task 30582 1726855342.95562: _execute() done 30582 1726855342.95565: dumping result to json 30582 1726855342.95572: done dumping result, returning 30582 1726855342.95578: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000001848] 30582 1726855342.95584: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001848 30582 1726855342.95674: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001848 30582 1726855342.95677: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855342.95756: no more pending results, returning what we have 30582 1726855342.95760: results queue empty 30582 1726855342.95761: checking for any_errors_fatal 30582 1726855342.95770: done checking for any_errors_fatal 30582 1726855342.95770: checking for max_fail_percentage 30582 1726855342.95772: done checking for max_fail_percentage 30582 1726855342.95773: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.95773: done checking to see if all hosts have failed 30582 1726855342.95774: getting the remaining hosts for this loop 30582 1726855342.95775: done getting the remaining hosts for this loop 30582 1726855342.95779: getting the next task for host managed_node3 30582 1726855342.95786: done getting next task for host managed_node3 30582 1726855342.95795: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855342.95800: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855342.95819: getting variables 30582 1726855342.95820: in VariableManager get_vars() 30582 1726855342.95856: Calling all_inventory to load vars for managed_node3 30582 1726855342.95859: Calling groups_inventory to load vars for managed_node3 30582 1726855342.95861: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.95869: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.95871: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.95874: Calling groups_plugins_play to load vars for managed_node3 30582 1726855342.96747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855342.97633: done with get_vars() 30582 1726855342.97650: done getting variables 30582 1726855342.97696: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:22 -0400 (0:00:00.032) 0:01:19.327 ****** 30582 1726855342.97731: entering _queue_task() for managed_node3/copy 30582 1726855342.97993: worker is 1 (out of 1 available) 30582 1726855342.98006: exiting _queue_task() for managed_node3/copy 30582 1726855342.98018: done queuing things up, now waiting for results queue to drain 30582 1726855342.98020: waiting for 
pending results... 30582 1726855342.98209: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855342.98303: in run() - task 0affcc66-ac2b-aa83-7d57-000000001849 30582 1726855342.98313: variable 'ansible_search_path' from source: unknown 30582 1726855342.98317: variable 'ansible_search_path' from source: unknown 30582 1726855342.98344: calling self._execute() 30582 1726855342.98416: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855342.98420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855342.98429: variable 'omit' from source: magic vars 30582 1726855342.98708: variable 'ansible_distribution_major_version' from source: facts 30582 1726855342.98718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855342.98803: variable 'network_provider' from source: set_fact 30582 1726855342.98809: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855342.98811: when evaluation is False, skipping this task 30582 1726855342.98814: _execute() done 30582 1726855342.98817: dumping result to json 30582 1726855342.98819: done dumping result, returning 30582 1726855342.98828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000001849] 30582 1726855342.98833: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001849 30582 1726855342.98924: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001849 30582 1726855342.98927: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855342.98973: no more pending results, returning what we have 30582 1726855342.98976: results queue empty 30582 
1726855342.98977: checking for any_errors_fatal 30582 1726855342.98984: done checking for any_errors_fatal 30582 1726855342.98985: checking for max_fail_percentage 30582 1726855342.98989: done checking for max_fail_percentage 30582 1726855342.98990: checking to see if all hosts have failed and the running result is not ok 30582 1726855342.98991: done checking to see if all hosts have failed 30582 1726855342.98991: getting the remaining hosts for this loop 30582 1726855342.98993: done getting the remaining hosts for this loop 30582 1726855342.98996: getting the next task for host managed_node3 30582 1726855342.99010: done getting next task for host managed_node3 30582 1726855342.99013: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855342.99019: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855342.99043: getting variables 30582 1726855342.99044: in VariableManager get_vars() 30582 1726855342.99082: Calling all_inventory to load vars for managed_node3 30582 1726855342.99084: Calling groups_inventory to load vars for managed_node3 30582 1726855342.99086: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855342.99101: Calling all_plugins_play to load vars for managed_node3 30582 1726855342.99104: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855342.99107: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.00324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.01697: done with get_vars() 30582 1726855343.01715: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:23 -0400 (0:00:00.040) 0:01:19.367 ****** 30582 1726855343.01778: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855343.02041: worker is 1 (out of 1 available) 30582 1726855343.02057: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855343.02068: done queuing things up, now waiting for results queue to drain 30582 1726855343.02070: waiting for pending results... 
30582 1726855343.02259: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855343.02360: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184a 30582 1726855343.02374: variable 'ansible_search_path' from source: unknown 30582 1726855343.02378: variable 'ansible_search_path' from source: unknown 30582 1726855343.02409: calling self._execute() 30582 1726855343.02478: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.02482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.02492: variable 'omit' from source: magic vars 30582 1726855343.02774: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.02784: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.02792: variable 'omit' from source: magic vars 30582 1726855343.02838: variable 'omit' from source: magic vars 30582 1726855343.02952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855343.04456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855343.04508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855343.04533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855343.04561: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855343.04586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855343.04648: variable 'network_provider' from source: set_fact 30582 1726855343.04748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855343.04766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855343.04789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855343.04816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855343.04827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855343.04882: variable 'omit' from source: magic vars 30582 1726855343.04957: variable 'omit' from source: magic vars 30582 1726855343.05032: variable 'network_connections' from source: include params 30582 1726855343.05043: variable 'interface' from source: play vars 30582 1726855343.05091: variable 'interface' from source: play vars 30582 1726855343.05200: variable 'omit' from source: magic vars 30582 1726855343.05208: variable '__lsr_ansible_managed' from source: task vars 30582 1726855343.05251: variable '__lsr_ansible_managed' from source: task vars 30582 1726855343.05391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855343.05534: Loaded config def from plugin (lookup/template) 30582 1726855343.05537: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855343.05559: File lookup term: get_ansible_managed.j2 30582 1726855343.05562: variable 
'ansible_search_path' from source: unknown 30582 1726855343.05566: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855343.05579: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855343.05594: variable 'ansible_search_path' from source: unknown 30582 1726855343.09168: variable 'ansible_managed' from source: unknown 30582 1726855343.09251: variable 'omit' from source: magic vars 30582 1726855343.09275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855343.09297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855343.09311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855343.09323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855343.09332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.09356: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855343.09360: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.09362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.09427: Set connection var ansible_timeout to 10 30582 1726855343.09430: Set connection var ansible_connection to ssh 30582 1726855343.09435: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855343.09440: Set connection var ansible_pipelining to False 30582 1726855343.09444: Set connection var ansible_shell_executable to /bin/sh 30582 1726855343.09447: Set connection var ansible_shell_type to sh 30582 1726855343.09467: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.09473: variable 'ansible_connection' from source: unknown 30582 1726855343.09475: variable 'ansible_module_compression' from source: unknown 30582 1726855343.09477: variable 'ansible_shell_type' from source: unknown 30582 1726855343.09479: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.09482: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.09488: variable 'ansible_pipelining' from source: unknown 30582 1726855343.09491: variable 'ansible_timeout' from source: unknown 30582 1726855343.09495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.09584: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855343.09598: variable 'omit' from 
source: magic vars 30582 1726855343.09601: starting attempt loop 30582 1726855343.09604: running the handler 30582 1726855343.09615: _low_level_execute_command(): starting 30582 1726855343.09621: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855343.10122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.10126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.10129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.10130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.10171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.10192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.10195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.10269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.11980: stdout chunk (state=3): >>>/root <<< 30582 1726855343.12081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855343.12111: stderr chunk (state=3): >>><<< 30582 1726855343.12114: stdout chunk (state=3): >>><<< 30582 1726855343.12134: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.12144: _low_level_execute_command(): starting 30582 1726855343.12150: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780 `" && echo ansible-tmp-1726855343.1213448-34278-260629345503780="` echo /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780 `" ) && sleep 0' 30582 1726855343.12598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.12602: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.12605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.12607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855343.12609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.12659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.12662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.12664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.12727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.14661: stdout chunk (state=3): >>>ansible-tmp-1726855343.1213448-34278-260629345503780=/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780 <<< 30582 1726855343.14756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.14785: stderr chunk (state=3): >>><<< 30582 1726855343.14790: stdout chunk (state=3): >>><<< 30582 1726855343.14806: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855343.1213448-34278-260629345503780=/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.14845: variable 'ansible_module_compression' from source: unknown 30582 1726855343.14886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855343.14928: variable 'ansible_facts' from source: unknown 30582 1726855343.15018: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py 30582 1726855343.15122: Sending initial data 30582 1726855343.15126: Sent initial data (168 bytes) 30582 1726855343.15574: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.15577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855343.15583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.15585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.15589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.15638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.15648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.15650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.15702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.17254: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855343.17307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855343.17371: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp8zn4id31 /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py <<< 30582 1726855343.17373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py" <<< 30582 1726855343.17421: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp8zn4id31" to remote "/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py" <<< 30582 1726855343.17427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py" <<< 30582 1726855343.18196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.18241: stderr chunk (state=3): >>><<< 30582 1726855343.18244: stdout chunk (state=3): >>><<< 30582 1726855343.18293: done transferring module to remote 30582 1726855343.18302: _low_level_execute_command(): starting 30582 1726855343.18306: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/ /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py && sleep 
0' 30582 1726855343.18757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.18761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855343.18763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.18765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.18770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855343.18772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.18822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.18828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.18830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.18884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.20645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.20674: stderr chunk (state=3): >>><<< 30582 1726855343.20677: stdout chunk (state=3): >>><<< 30582 1726855343.20692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.20695: _low_level_execute_command(): starting 30582 1726855343.20702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/AnsiballZ_network_connections.py && sleep 0' 30582 1726855343.21156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.21160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.21162: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.21164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.21221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.21224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.21230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.21296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.50034: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855343.52770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855343.52775: stdout chunk (state=3): >>><<< 30582 1726855343.52778: stderr chunk (state=3): >>><<< 30582 1726855343.52780: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855343.52783: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855343.52785: _low_level_execute_command(): starting 30582 1726855343.52790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855343.1213448-34278-260629345503780/ > /dev/null 2>&1 && sleep 0' 30582 1726855343.53385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855343.53403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.53419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.53439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855343.53549: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.53584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.53693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.55603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.55607: stdout chunk (state=3): >>><<< 30582 1726855343.55613: stderr chunk (state=3): >>><<< 30582 1726855343.55630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.55692: handler run complete 30582 1726855343.55695: attempt loop complete, returning result 30582 1726855343.55697: _execute() done 30582 1726855343.55698: dumping result to json 30582 1726855343.55700: done dumping result, returning 30582 1726855343.55712: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-00000000184a] 30582 1726855343.55722: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184a 30582 1726855343.55931: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184a 30582 1726855343.55934: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced 30582 1726855343.56198: no more pending results, returning what we have 30582 1726855343.56202: results queue empty 30582 1726855343.56204: checking for any_errors_fatal 30582 1726855343.56210: done checking for any_errors_fatal 30582 1726855343.56211: checking for max_fail_percentage 30582 1726855343.56213: done checking for max_fail_percentage 30582 1726855343.56214: checking to 
see if all hosts have failed and the running result is not ok 30582 1726855343.56214: done checking to see if all hosts have failed 30582 1726855343.56215: getting the remaining hosts for this loop 30582 1726855343.56217: done getting the remaining hosts for this loop 30582 1726855343.56220: getting the next task for host managed_node3 30582 1726855343.56228: done getting next task for host managed_node3 30582 1726855343.56231: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855343.56236: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855343.56251: getting variables 30582 1726855343.56252: in VariableManager get_vars() 30582 1726855343.56406: Calling all_inventory to load vars for managed_node3 30582 1726855343.56409: Calling groups_inventory to load vars for managed_node3 30582 1726855343.56412: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855343.56422: Calling all_plugins_play to load vars for managed_node3 30582 1726855343.56425: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855343.56428: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.57899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.59602: done with get_vars() 30582 1726855343.59641: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:23 -0400 (0:00:00.579) 0:01:19.947 ****** 30582 1726855343.59741: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855343.60140: worker is 1 (out of 1 available) 30582 1726855343.60153: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855343.60166: done queuing things up, now waiting for results queue to drain 30582 1726855343.60167: waiting for pending results... 
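The task result above reports `changed: true` from the `fedora.linux_system_roles.network_connections` module, with the profile spelled out in the logged `module_args`. As a non-authoritative sketch, role input along these lines would produce that invocation (the connection structure is taken directly from the logged `module_args`; the play wrapper and host pattern are assumptions, not the verbatim test playbook):

```yaml
# Hypothetical play reconstructed from the logged module_args;
# not copied from the actual test playbook.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr            # the profile the log shows being added
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false           # matches "dhcp4": false in module_args
              auto6: false           # matches "auto6": false in module_args
```

The stderr line in the result (`[002] #0 ... add connection statebr, 07988b43-...`) is the module's own progress message for creating that profile via the `nm` (NetworkManager) provider.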
30582 1726855343.60519: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855343.60632: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184b 30582 1726855343.60655: variable 'ansible_search_path' from source: unknown 30582 1726855343.60664: variable 'ansible_search_path' from source: unknown 30582 1726855343.60708: calling self._execute() 30582 1726855343.60830: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.60834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.60838: variable 'omit' from source: magic vars 30582 1726855343.61238: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.61266: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.61485: variable 'network_state' from source: role '' defaults 30582 1726855343.61491: Evaluated conditional (network_state != {}): False 30582 1726855343.61493: when evaluation is False, skipping this task 30582 1726855343.61496: _execute() done 30582 1726855343.61498: dumping result to json 30582 1726855343.61500: done dumping result, returning 30582 1726855343.61502: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-00000000184b] 30582 1726855343.61504: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184b 30582 1726855343.61576: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184b 30582 1726855343.61579: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855343.61642: no more pending results, returning what we have 30582 1726855343.61646: results queue empty 30582 1726855343.61647: checking for any_errors_fatal 30582 1726855343.61657: done checking for any_errors_fatal 
30582 1726855343.61658: checking for max_fail_percentage 30582 1726855343.61660: done checking for max_fail_percentage 30582 1726855343.61661: checking to see if all hosts have failed and the running result is not ok 30582 1726855343.61662: done checking to see if all hosts have failed 30582 1726855343.61663: getting the remaining hosts for this loop 30582 1726855343.61664: done getting the remaining hosts for this loop 30582 1726855343.61668: getting the next task for host managed_node3 30582 1726855343.61676: done getting next task for host managed_node3 30582 1726855343.61681: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855343.61689: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855343.61715: getting variables 30582 1726855343.61717: in VariableManager get_vars() 30582 1726855343.61761: Calling all_inventory to load vars for managed_node3 30582 1726855343.61764: Calling groups_inventory to load vars for managed_node3 30582 1726855343.61766: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855343.61779: Calling all_plugins_play to load vars for managed_node3 30582 1726855343.61783: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855343.61786: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.63728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.65378: done with get_vars() 30582 1726855343.65414: done getting variables 30582 1726855343.65492: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:23 -0400 (0:00:00.057) 0:01:20.005 ****** 30582 1726855343.65533: entering _queue_task() for managed_node3/debug 30582 1726855343.66007: worker is 1 (out of 1 available) 30582 1726855343.66024: exiting _queue_task() for managed_node3/debug 30582 1726855343.66035: done queuing things up, now waiting for results queue to drain 30582 1726855343.66036: waiting for pending results... 
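The "Configure networking state" task above is skipped because the conditional `network_state != {}` evaluates False: `network_state` is still at its role default of `{}` (the log notes it comes "from source: role '' defaults"). A minimal sketch of what would flip that conditional — the exact key structure here is an assumption based on the role's nmstate-style input, not taken from this run:

```yaml
# Illustrative only: any non-empty network_state would make the
# "Configure networking state" task run instead of skip.
network_state:
  interfaces:
    - name: statebr          # hypothetical nmstate-style entry
      type: linux-bridge
      state: up
```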
30582 1726855343.66305: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855343.66428: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184c 30582 1726855343.66491: variable 'ansible_search_path' from source: unknown 30582 1726855343.66498: variable 'ansible_search_path' from source: unknown 30582 1726855343.66505: calling self._execute() 30582 1726855343.66603: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.66618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.66631: variable 'omit' from source: magic vars 30582 1726855343.67024: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.67092: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.67095: variable 'omit' from source: magic vars 30582 1726855343.67129: variable 'omit' from source: magic vars 30582 1726855343.67193: variable 'omit' from source: magic vars 30582 1726855343.67217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855343.67256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855343.67282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855343.67305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.67427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.67430: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855343.67433: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.67435: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855343.67483: Set connection var ansible_timeout to 10 30582 1726855343.67493: Set connection var ansible_connection to ssh 30582 1726855343.67504: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855343.67512: Set connection var ansible_pipelining to False 30582 1726855343.67520: Set connection var ansible_shell_executable to /bin/sh 30582 1726855343.67526: Set connection var ansible_shell_type to sh 30582 1726855343.67560: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.67567: variable 'ansible_connection' from source: unknown 30582 1726855343.67574: variable 'ansible_module_compression' from source: unknown 30582 1726855343.67580: variable 'ansible_shell_type' from source: unknown 30582 1726855343.67585: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.67594: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.67601: variable 'ansible_pipelining' from source: unknown 30582 1726855343.67607: variable 'ansible_timeout' from source: unknown 30582 1726855343.67614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.67758: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855343.67861: variable 'omit' from source: magic vars 30582 1726855343.67864: starting attempt loop 30582 1726855343.67867: running the handler 30582 1726855343.67933: variable '__network_connections_result' from source: set_fact 30582 1726855343.68000: handler run complete 30582 1726855343.68022: attempt loop complete, returning result 30582 1726855343.68029: _execute() done 30582 1726855343.68035: dumping result to json 30582 1726855343.68042: 
done dumping result, returning 30582 1726855343.68054: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000184c] 30582 1726855343.68063: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184c ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced" ] } 30582 1726855343.68268: no more pending results, returning what we have 30582 1726855343.68272: results queue empty 30582 1726855343.68274: checking for any_errors_fatal 30582 1726855343.68281: done checking for any_errors_fatal 30582 1726855343.68282: checking for max_fail_percentage 30582 1726855343.68283: done checking for max_fail_percentage 30582 1726855343.68285: checking to see if all hosts have failed and the running result is not ok 30582 1726855343.68286: done checking to see if all hosts have failed 30582 1726855343.68286: getting the remaining hosts for this loop 30582 1726855343.68289: done getting the remaining hosts for this loop 30582 1726855343.68294: getting the next task for host managed_node3 30582 1726855343.68302: done getting next task for host managed_node3 30582 1726855343.68395: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855343.68402: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855343.68500: getting variables 30582 1726855343.68502: in VariableManager get_vars() 30582 1726855343.68551: Calling all_inventory to load vars for managed_node3 30582 1726855343.68553: Calling groups_inventory to load vars for managed_node3 30582 1726855343.68556: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855343.68562: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184c 30582 1726855343.68565: WORKER PROCESS EXITING 30582 1726855343.68574: Calling all_plugins_play to load vars for managed_node3 30582 1726855343.68577: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855343.68580: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.70095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.71716: done with get_vars() 30582 1726855343.71749: done getting variables 30582 1726855343.71817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:23 -0400 (0:00:00.063) 0:01:20.068 ****** 30582 1726855343.71860: entering _queue_task() for managed_node3/debug 30582 1726855343.72248: worker is 1 (out of 1 available) 30582 1726855343.72374: exiting _queue_task() for managed_node3/debug 30582 1726855343.72384: done queuing things up, now waiting for results queue to drain 30582 1726855343.72386: waiting for pending results... 30582 1726855343.72585: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855343.72814: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184d 30582 1726855343.72818: variable 'ansible_search_path' from source: unknown 30582 1726855343.72822: variable 'ansible_search_path' from source: unknown 30582 1726855343.72833: calling self._execute() 30582 1726855343.72942: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.72953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.72966: variable 'omit' from source: magic vars 30582 1726855343.73377: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.73466: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.73470: variable 'omit' from source: magic vars 30582 1726855343.73493: variable 'omit' from source: magic vars 30582 1726855343.73533: variable 'omit' from source: magic vars 30582 1726855343.73584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855343.73629: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855343.73654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855343.73677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.73703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.73739: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855343.73749: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.73794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.73881: Set connection var ansible_timeout to 10 30582 1726855343.73891: Set connection var ansible_connection to ssh 30582 1726855343.73913: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855343.73927: Set connection var ansible_pipelining to False 30582 1726855343.73936: Set connection var ansible_shell_executable to /bin/sh 30582 1726855343.74011: Set connection var ansible_shell_type to sh 30582 1726855343.74014: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.74016: variable 'ansible_connection' from source: unknown 30582 1726855343.74018: variable 'ansible_module_compression' from source: unknown 30582 1726855343.74023: variable 'ansible_shell_type' from source: unknown 30582 1726855343.74025: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.74027: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.74029: variable 'ansible_pipelining' from source: unknown 30582 1726855343.74031: variable 'ansible_timeout' from source: unknown 30582 1726855343.74033: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855343.74168: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855343.74186: variable 'omit' from source: magic vars 30582 1726855343.74199: starting attempt loop 30582 1726855343.74205: running the handler 30582 1726855343.74336: variable '__network_connections_result' from source: set_fact 30582 1726855343.74359: variable '__network_connections_result' from source: set_fact 30582 1726855343.74495: handler run complete 30582 1726855343.74528: attempt loop complete, returning result 30582 1726855343.74537: _execute() done 30582 1726855343.74544: dumping result to json 30582 1726855343.74558: done dumping result, returning 30582 1726855343.74571: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000184d] 30582 1726855343.74585: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184d ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced" ] } } 30582 1726855343.74794: no more pending results, returning what we have 30582 
1726855343.74799: results queue empty 30582 1726855343.74800: checking for any_errors_fatal 30582 1726855343.74809: done checking for any_errors_fatal 30582 1726855343.74810: checking for max_fail_percentage 30582 1726855343.74812: done checking for max_fail_percentage 30582 1726855343.74813: checking to see if all hosts have failed and the running result is not ok 30582 1726855343.74814: done checking to see if all hosts have failed 30582 1726855343.74815: getting the remaining hosts for this loop 30582 1726855343.74816: done getting the remaining hosts for this loop 30582 1726855343.74820: getting the next task for host managed_node3 30582 1726855343.74828: done getting next task for host managed_node3 30582 1726855343.74832: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855343.74838: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855343.74854: getting variables 30582 1726855343.74856: in VariableManager get_vars() 30582 1726855343.75119: Calling all_inventory to load vars for managed_node3 30582 1726855343.75123: Calling groups_inventory to load vars for managed_node3 30582 1726855343.75131: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184d 30582 1726855343.75142: WORKER PROCESS EXITING 30582 1726855343.75138: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855343.75153: Calling all_plugins_play to load vars for managed_node3 30582 1726855343.75156: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855343.75160: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.76872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.78484: done with get_vars() 30582 1726855343.78516: done getting variables 30582 1726855343.78583: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:23 -0400 (0:00:00.067) 0:01:20.136 ****** 30582 1726855343.78624: entering _queue_task() for managed_node3/debug 30582 1726855343.79210: worker is 1 (out of 1 available) 30582 1726855343.79221: exiting _queue_task() for managed_node3/debug 30582 1726855343.79232: done queuing things up, now waiting for results queue to drain 30582 1726855343.79233: waiting for pending results... 
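The two debug tasks above (task paths `roles/network/tasks/main.yml:177` and `:181`) print `__network_connections_result.stderr_lines` and then the full `__network_connections_result` fact set earlier in the run. A representative sketch of what such tasks look like — paraphrased from the logged task names and output, not copied from the role's actual `main.yml`:

```yaml
# Representative sketch of the two debug tasks seen in the log output.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```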
30582 1726855343.79350: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855343.79529: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184e 30582 1726855343.79572: variable 'ansible_search_path' from source: unknown 30582 1726855343.79580: variable 'ansible_search_path' from source: unknown 30582 1726855343.79679: calling self._execute() 30582 1726855343.79718: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.79729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.79744: variable 'omit' from source: magic vars 30582 1726855343.80150: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.80169: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.80308: variable 'network_state' from source: role '' defaults 30582 1726855343.80333: Evaluated conditional (network_state != {}): False 30582 1726855343.80344: when evaluation is False, skipping this task 30582 1726855343.80439: _execute() done 30582 1726855343.80444: dumping result to json 30582 1726855343.80447: done dumping result, returning 30582 1726855343.80450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-00000000184e] 30582 1726855343.80452: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184e 30582 1726855343.80530: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184e 30582 1726855343.80534: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855343.80590: no more pending results, returning what we have 30582 1726855343.80595: results queue empty 30582 1726855343.80596: checking for any_errors_fatal 30582 1726855343.80607: done checking for any_errors_fatal 30582 1726855343.80608: checking for 
max_fail_percentage 30582 1726855343.80611: done checking for max_fail_percentage 30582 1726855343.80612: checking to see if all hosts have failed and the running result is not ok 30582 1726855343.80613: done checking to see if all hosts have failed 30582 1726855343.80613: getting the remaining hosts for this loop 30582 1726855343.80615: done getting the remaining hosts for this loop 30582 1726855343.80619: getting the next task for host managed_node3 30582 1726855343.80628: done getting next task for host managed_node3 30582 1726855343.80633: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855343.80639: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855343.80779: getting variables 30582 1726855343.80781: in VariableManager get_vars() 30582 1726855343.80831: Calling all_inventory to load vars for managed_node3 30582 1726855343.80835: Calling groups_inventory to load vars for managed_node3 30582 1726855343.80838: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855343.80851: Calling all_plugins_play to load vars for managed_node3 30582 1726855343.80855: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855343.80859: Calling groups_plugins_play to load vars for managed_node3 30582 1726855343.82425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855343.84276: done with get_vars() 30582 1726855343.84302: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:23 -0400 (0:00:00.057) 0:01:20.193 ****** 30582 1726855343.84409: entering _queue_task() for managed_node3/ping 30582 1726855343.84908: worker is 1 (out of 1 available) 30582 1726855343.84919: exiting _queue_task() for managed_node3/ping 30582 1726855343.84929: done queuing things up, now waiting for results queue to drain 30582 1726855343.84931: waiting for pending results... 
30582 1726855343.85173: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855343.85299: in run() - task 0affcc66-ac2b-aa83-7d57-00000000184f 30582 1726855343.85399: variable 'ansible_search_path' from source: unknown 30582 1726855343.85403: variable 'ansible_search_path' from source: unknown 30582 1726855343.85406: calling self._execute() 30582 1726855343.85498: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.85509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.85524: variable 'omit' from source: magic vars 30582 1726855343.85922: variable 'ansible_distribution_major_version' from source: facts 30582 1726855343.85940: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855343.85952: variable 'omit' from source: magic vars 30582 1726855343.86020: variable 'omit' from source: magic vars 30582 1726855343.86193: variable 'omit' from source: magic vars 30582 1726855343.86196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855343.86200: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855343.86202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855343.86205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.86207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855343.86238: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855343.86246: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.86254: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855343.86375: Set connection var ansible_timeout to 10 30582 1726855343.86383: Set connection var ansible_connection to ssh 30582 1726855343.86398: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855343.86407: Set connection var ansible_pipelining to False 30582 1726855343.86416: Set connection var ansible_shell_executable to /bin/sh 30582 1726855343.86422: Set connection var ansible_shell_type to sh 30582 1726855343.86454: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.86461: variable 'ansible_connection' from source: unknown 30582 1726855343.86467: variable 'ansible_module_compression' from source: unknown 30582 1726855343.86473: variable 'ansible_shell_type' from source: unknown 30582 1726855343.86478: variable 'ansible_shell_executable' from source: unknown 30582 1726855343.86543: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855343.86546: variable 'ansible_pipelining' from source: unknown 30582 1726855343.86548: variable 'ansible_timeout' from source: unknown 30582 1726855343.86550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855343.86714: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855343.86734: variable 'omit' from source: magic vars 30582 1726855343.86743: starting attempt loop 30582 1726855343.86749: running the handler 30582 1726855343.86775: _low_level_execute_command(): starting 30582 1726855343.86790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855343.87627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.87657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.87677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.87701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.87799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.89518: stdout chunk (state=3): >>>/root <<< 30582 1726855343.89682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.89686: stdout chunk (state=3): >>><<< 30582 1726855343.89692: stderr chunk (state=3): >>><<< 30582 1726855343.89819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.89823: _low_level_execute_command(): starting 30582 1726855343.89826: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609 `" && echo ansible-tmp-1726855343.8971868-34303-36069567574609="` echo /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609 `" ) && sleep 0' 30582 1726855343.90400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855343.90415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.90435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.90493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855343.90511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.90590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.90641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.90710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.92645: stdout chunk (state=3): >>>ansible-tmp-1726855343.8971868-34303-36069567574609=/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609 <<< 30582 1726855343.93047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.93051: stdout chunk (state=3): >>><<< 30582 1726855343.93055: stderr chunk (state=3): >>><<< 30582 1726855343.93076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855343.8971868-34303-36069567574609=/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855343.93398: variable 'ansible_module_compression' from source: unknown 30582 1726855343.93401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855343.93426: variable 'ansible_facts' from source: unknown 30582 1726855343.93591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py 30582 1726855343.94027: Sending initial data 30582 1726855343.94030: Sent initial data (152 bytes) 30582 1726855343.94672: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.94679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855343.94794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.94830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.94893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855343.96507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855343.96577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855343.96662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxcsfpy42 /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py <<< 30582 1726855343.96665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py" <<< 30582 1726855343.96717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxcsfpy42" to remote "/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py" <<< 30582 1726855343.98089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855343.98094: stderr chunk (state=3): >>><<< 30582 1726855343.98096: stdout chunk (state=3): >>><<< 30582 1726855343.98120: done transferring module to remote 30582 1726855343.98131: _low_level_execute_command(): starting 30582 1726855343.98136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/ /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py && sleep 0' 30582 1726855343.98750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855343.98753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.98773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.98777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855343.98794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855343.98801: 
stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855343.98813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.98831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855343.98834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855343.98837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855343.98906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855343.98909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855343.98911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855343.98913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855343.98915: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855343.98917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855343.98960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855343.98973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855343.98995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855343.99081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855344.00856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855344.00893: stderr chunk (state=3): >>><<< 30582 1726855344.00897: stdout chunk (state=3): >>><<< 30582 1726855344.00909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855344.00912: _low_level_execute_command(): starting 30582 1726855344.00918: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/AnsiballZ_ping.py && sleep 0' 30582 1726855344.01362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855344.01365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855344.01371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855344.01373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855344.01375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855344.01427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855344.01436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855344.01439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855344.01501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855344.16466: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855344.17846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855344.17881: stderr chunk (state=3): >>><<< 30582 1726855344.17885: stdout chunk (state=3): >>><<< 30582 1726855344.17908: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855344.18024: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855344.18027: _low_level_execute_command(): starting 30582 1726855344.18030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855343.8971868-34303-36069567574609/ > /dev/null 2>&1 && sleep 0' 30582 1726855344.18636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855344.18650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855344.18668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855344.18690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855344.18756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855344.18819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855344.18884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855344.18941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855344.21201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855344.21210: stdout chunk (state=3): >>><<< 30582 1726855344.21213: stderr chunk (state=3): >>><<< 30582 1726855344.21215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
30582 1726855344.21218: handler run complete 30582 1726855344.21220: attempt loop complete, returning result 30582 1726855344.21222: _execute() done 30582 1726855344.21225: dumping result to json 30582 1726855344.21227: done dumping result, returning 30582 1726855344.21229: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-00000000184f] 30582 1726855344.21231: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184f 30582 1726855344.21501: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000184f 30582 1726855344.21505: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855344.21607: no more pending results, returning what we have 30582 1726855344.21612: results queue empty 30582 1726855344.21613: checking for any_errors_fatal 30582 1726855344.21624: done checking for any_errors_fatal 30582 1726855344.21625: checking for max_fail_percentage 30582 1726855344.21636: done checking for max_fail_percentage 30582 1726855344.21637: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.21638: done checking to see if all hosts have failed 30582 1726855344.21641: getting the remaining hosts for this loop 30582 1726855344.21643: done getting the remaining hosts for this loop 30582 1726855344.21650: getting the next task for host managed_node3 30582 1726855344.21663: done getting next task for host managed_node3 30582 1726855344.21668: ^ task is: TASK: meta (role_complete) 30582 1726855344.21674: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855344.21811: getting variables 30582 1726855344.21814: in VariableManager get_vars() 30582 1726855344.21865: Calling all_inventory to load vars for managed_node3 30582 1726855344.21874: Calling groups_inventory to load vars for managed_node3 30582 1726855344.21876: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.21890: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.21894: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.21897: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.25940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.27255: done with get_vars() 30582 1726855344.27281: done getting variables 30582 1726855344.27345: done queuing things up, now waiting for results queue to drain 30582 1726855344.27346: results queue empty 30582 1726855344.27347: checking for any_errors_fatal 30582 1726855344.27349: done checking for any_errors_fatal 30582 1726855344.27349: checking for max_fail_percentage 30582 1726855344.27350: done checking for max_fail_percentage 30582 1726855344.27351: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855344.27351: done checking to see if all hosts have failed 30582 1726855344.27352: getting the remaining hosts for this loop 30582 1726855344.27352: done getting the remaining hosts for this loop 30582 1726855344.27355: getting the next task for host managed_node3 30582 1726855344.27358: done getting next task for host managed_node3 30582 1726855344.27359: ^ task is: TASK: Show result 30582 1726855344.27361: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855344.27363: getting variables 30582 1726855344.27364: in VariableManager get_vars() 30582 1726855344.27374: Calling all_inventory to load vars for managed_node3 30582 1726855344.27376: Calling groups_inventory to load vars for managed_node3 30582 1726855344.27377: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.27381: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.27382: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.27384: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.28814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.31300: done with get_vars() 30582 1726855344.31409: done getting variables 30582 1726855344.31464: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 14:02:24 -0400 (0:00:00.470) 0:01:20.664 ****** 30582 1726855344.31503: entering _queue_task() for managed_node3/debug 30582 1726855344.31995: worker is 1 (out of 1 available) 30582 1726855344.32011: exiting _queue_task() for managed_node3/debug 30582 1726855344.32022: done queuing things up, now waiting for results queue to drain 30582 1726855344.32024: waiting for pending results... 
30582 1726855344.32425: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855344.32439: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017d1 30582 1726855344.32458: variable 'ansible_search_path' from source: unknown 30582 1726855344.32469: variable 'ansible_search_path' from source: unknown 30582 1726855344.32525: calling self._execute() 30582 1726855344.32637: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.32648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.32665: variable 'omit' from source: magic vars 30582 1726855344.33090: variable 'ansible_distribution_major_version' from source: facts 30582 1726855344.33108: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855344.33121: variable 'omit' from source: magic vars 30582 1726855344.33185: variable 'omit' from source: magic vars 30582 1726855344.33284: variable 'omit' from source: magic vars 30582 1726855344.33288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855344.33317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855344.33342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855344.33369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855344.33397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855344.33433: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855344.33441: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.33449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.33572: Set 
connection var ansible_timeout to 10 30582 1726855344.33605: Set connection var ansible_connection to ssh 30582 1726855344.33610: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855344.33613: Set connection var ansible_pipelining to False 30582 1726855344.33623: Set connection var ansible_shell_executable to /bin/sh 30582 1726855344.33712: Set connection var ansible_shell_type to sh 30582 1726855344.33716: variable 'ansible_shell_executable' from source: unknown 30582 1726855344.33719: variable 'ansible_connection' from source: unknown 30582 1726855344.33721: variable 'ansible_module_compression' from source: unknown 30582 1726855344.33723: variable 'ansible_shell_type' from source: unknown 30582 1726855344.33725: variable 'ansible_shell_executable' from source: unknown 30582 1726855344.33727: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.33729: variable 'ansible_pipelining' from source: unknown 30582 1726855344.33731: variable 'ansible_timeout' from source: unknown 30582 1726855344.33733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.33863: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855344.33882: variable 'omit' from source: magic vars 30582 1726855344.33929: starting attempt loop 30582 1726855344.33933: running the handler 30582 1726855344.33960: variable '__network_connections_result' from source: set_fact 30582 1726855344.34052: variable '__network_connections_result' from source: set_fact 30582 1726855344.34205: handler run complete 30582 1726855344.34238: attempt loop complete, returning result 30582 1726855344.34257: _execute() done 30582 1726855344.34260: dumping result to json 30582 
1726855344.34282: done dumping result, returning 30582 1726855344.34286: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-0000000017d1] 30582 1726855344.34368: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d1 30582 1726855344.34455: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d1 30582 1726855344.34459: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced" ] } } 30582 1726855344.34544: no more pending results, returning what we have 30582 1726855344.34548: results queue empty 30582 1726855344.34549: checking for any_errors_fatal 30582 1726855344.34551: done checking for any_errors_fatal 30582 1726855344.34552: checking for max_fail_percentage 30582 1726855344.34554: done checking for max_fail_percentage 30582 1726855344.34555: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.34556: done checking to see if all hosts have failed 30582 1726855344.34557: getting the remaining hosts for this loop 30582 1726855344.34558: done getting the remaining hosts for this loop 30582 1726855344.34562: getting the next task for host managed_node3 30582 1726855344.34579: done getting next task for host managed_node3 30582 1726855344.34582: ^ task is: TASK: Include network role 30582 1726855344.34587: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855344.34595: getting variables 30582 1726855344.34596: in VariableManager get_vars() 30582 1726855344.34659: Calling all_inventory to load vars for managed_node3 30582 1726855344.34663: Calling groups_inventory to load vars for managed_node3 30582 1726855344.34669: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.34681: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.34685: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.34944: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.36701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.38396: done with get_vars() 30582 1726855344.38428: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 14:02:24 -0400 (0:00:00.070) 0:01:20.735 ****** 30582 1726855344.38539: entering _queue_task() for 
managed_node3/include_role 30582 1726855344.38960: worker is 1 (out of 1 available) 30582 1726855344.38976: exiting _queue_task() for managed_node3/include_role 30582 1726855344.39132: done queuing things up, now waiting for results queue to drain 30582 1726855344.39135: waiting for pending results... 30582 1726855344.39510: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855344.39515: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017d5 30582 1726855344.39524: variable 'ansible_search_path' from source: unknown 30582 1726855344.39555: variable 'ansible_search_path' from source: unknown 30582 1726855344.40094: calling self._execute() 30582 1726855344.40100: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.40103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.40105: variable 'omit' from source: magic vars 30582 1726855344.40757: variable 'ansible_distribution_major_version' from source: facts 30582 1726855344.40826: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855344.40839: _execute() done 30582 1726855344.40847: dumping result to json 30582 1726855344.40856: done dumping result, returning 30582 1726855344.40883: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-0000000017d5] 30582 1726855344.40897: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d5 30582 1726855344.41318: no more pending results, returning what we have 30582 1726855344.41324: in VariableManager get_vars() 30582 1726855344.41376: Calling all_inventory to load vars for managed_node3 30582 1726855344.41379: Calling groups_inventory to load vars for managed_node3 30582 1726855344.41383: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.41401: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.41406: Calling groups_plugins_inventory to load vars 
for managed_node3 30582 1726855344.41409: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.42235: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d5 30582 1726855344.42239: WORKER PROCESS EXITING 30582 1726855344.44315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.46241: done with get_vars() 30582 1726855344.46271: variable 'ansible_search_path' from source: unknown 30582 1726855344.46277: variable 'ansible_search_path' from source: unknown 30582 1726855344.46434: variable 'omit' from source: magic vars 30582 1726855344.46474: variable 'omit' from source: magic vars 30582 1726855344.46495: variable 'omit' from source: magic vars 30582 1726855344.46499: we have included files to process 30582 1726855344.46500: generating all_blocks data 30582 1726855344.46502: done generating all_blocks data 30582 1726855344.46507: processing included file: fedora.linux_system_roles.network 30582 1726855344.46528: in VariableManager get_vars() 30582 1726855344.46544: done with get_vars() 30582 1726855344.46601: in VariableManager get_vars() 30582 1726855344.46620: done with get_vars() 30582 1726855344.46665: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855344.47048: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855344.47190: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855344.48085: in VariableManager get_vars() 30582 1726855344.48123: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855344.50452: iterating over new_blocks loaded from include file 30582 1726855344.50455: in VariableManager get_vars() 30582 1726855344.50477: done with get_vars() 30582 1726855344.50479: 
filtering new block on tags 30582 1726855344.51175: done filtering new block on tags 30582 1726855344.51179: in VariableManager get_vars() 30582 1726855344.51198: done with get_vars() 30582 1726855344.51200: filtering new block on tags 30582 1726855344.51338: done filtering new block on tags 30582 1726855344.51341: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855344.51346: extending task lists for all hosts with included blocks 30582 1726855344.51638: done extending task lists 30582 1726855344.51639: done processing included files 30582 1726855344.51640: results queue empty 30582 1726855344.51641: checking for any_errors_fatal 30582 1726855344.51766: done checking for any_errors_fatal 30582 1726855344.51768: checking for max_fail_percentage 30582 1726855344.51769: done checking for max_fail_percentage 30582 1726855344.51796: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.51810: done checking to see if all hosts have failed 30582 1726855344.51836: getting the remaining hosts for this loop 30582 1726855344.51837: done getting the remaining hosts for this loop 30582 1726855344.51847: getting the next task for host managed_node3 30582 1726855344.51853: done getting next task for host managed_node3 30582 1726855344.51857: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855344.51860: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855344.51982: getting variables 30582 1726855344.51984: in VariableManager get_vars() 30582 1726855344.52019: Calling all_inventory to load vars for managed_node3 30582 1726855344.52091: Calling groups_inventory to load vars for managed_node3 30582 1726855344.52094: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.52101: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.52103: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.52106: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.56624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.60177: done with get_vars() 30582 1726855344.60223: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:24 -0400 (0:00:00.217) 0:01:20.953 ****** 30582 1726855344.60334: entering _queue_task() for managed_node3/include_tasks 30582 1726855344.60922: worker is 1 (out of 1 available) 30582 
1726855344.60933: exiting _queue_task() for managed_node3/include_tasks 30582 1726855344.60942: done queuing things up, now waiting for results queue to drain 30582 1726855344.60943: waiting for pending results... 30582 1726855344.61235: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855344.61422: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019bf 30582 1726855344.61426: variable 'ansible_search_path' from source: unknown 30582 1726855344.61429: variable 'ansible_search_path' from source: unknown 30582 1726855344.61432: calling self._execute() 30582 1726855344.61508: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.61522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.61539: variable 'omit' from source: magic vars 30582 1726855344.62081: variable 'ansible_distribution_major_version' from source: facts 30582 1726855344.62124: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855344.62139: _execute() done 30582 1726855344.62148: dumping result to json 30582 1726855344.62161: done dumping result, returning 30582 1726855344.62178: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-0000000019bf] 30582 1726855344.62212: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019bf 30582 1726855344.62433: no more pending results, returning what we have 30582 1726855344.62439: in VariableManager get_vars() 30582 1726855344.62502: Calling all_inventory to load vars for managed_node3 30582 1726855344.62506: Calling groups_inventory to load vars for managed_node3 30582 1726855344.62509: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.62524: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.62528: Calling groups_plugins_inventory to 
load vars for managed_node3 30582 1726855344.62531: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.63224: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019bf 30582 1726855344.63228: WORKER PROCESS EXITING 30582 1726855344.64623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.69185: done with get_vars() 30582 1726855344.69224: variable 'ansible_search_path' from source: unknown 30582 1726855344.69227: variable 'ansible_search_path' from source: unknown 30582 1726855344.69358: we have included files to process 30582 1726855344.69360: generating all_blocks data 30582 1726855344.69434: done generating all_blocks data 30582 1726855344.69442: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855344.69444: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855344.69477: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855344.70912: done processing included file 30582 1726855344.70914: iterating over new_blocks loaded from include file 30582 1726855344.70916: in VariableManager get_vars() 30582 1726855344.70944: done with get_vars() 30582 1726855344.70946: filtering new block on tags 30582 1726855344.70982: done filtering new block on tags 30582 1726855344.70985: in VariableManager get_vars() 30582 1726855344.71137: done with get_vars() 30582 1726855344.71139: filtering new block on tags 30582 1726855344.71362: done filtering new block on tags 30582 1726855344.71365: in VariableManager get_vars() 30582 1726855344.71396: done with get_vars() 30582 1726855344.71398: filtering new block on tags 30582 1726855344.71586: done filtering new block on tags 30582 1726855344.71590: done iterating over new_blocks 
loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855344.71596: extending task lists for all hosts with included blocks 30582 1726855344.74115: done extending task lists 30582 1726855344.74117: done processing included files 30582 1726855344.74117: results queue empty 30582 1726855344.74118: checking for any_errors_fatal 30582 1726855344.74122: done checking for any_errors_fatal 30582 1726855344.74123: checking for max_fail_percentage 30582 1726855344.74124: done checking for max_fail_percentage 30582 1726855344.74126: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.74127: done checking to see if all hosts have failed 30582 1726855344.74128: getting the remaining hosts for this loop 30582 1726855344.74129: done getting the remaining hosts for this loop 30582 1726855344.74132: getting the next task for host managed_node3 30582 1726855344.74138: done getting next task for host managed_node3 30582 1726855344.74141: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855344.74145: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855344.74162: getting variables 30582 1726855344.74164: in VariableManager get_vars() 30582 1726855344.74181: Calling all_inventory to load vars for managed_node3 30582 1726855344.74184: Calling groups_inventory to load vars for managed_node3 30582 1726855344.74186: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.74194: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.74196: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.74199: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.75630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.77526: done with get_vars() 30582 1726855344.77562: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:24 -0400 (0:00:00.173) 0:01:21.126 ****** 30582 1726855344.77659: entering _queue_task() for managed_node3/setup 30582 1726855344.78634: worker is 1 (out of 1 available) 30582 1726855344.78698: exiting _queue_task() for managed_node3/setup 30582 
1726855344.78710: done queuing things up, now waiting for results queue to drain 30582 1726855344.78712: waiting for pending results... 30582 1726855344.79376: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855344.79740: in run() - task 0affcc66-ac2b-aa83-7d57-000000001a16 30582 1726855344.79751: variable 'ansible_search_path' from source: unknown 30582 1726855344.79872: variable 'ansible_search_path' from source: unknown 30582 1726855344.80071: calling self._execute() 30582 1726855344.80172: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.80214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.80230: variable 'omit' from source: magic vars 30582 1726855344.80953: variable 'ansible_distribution_major_version' from source: facts 30582 1726855344.80968: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855344.81271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855344.83594: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855344.83729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855344.83734: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855344.83742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855344.83773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855344.83858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30582 1726855344.83894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855344.83924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855344.83964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855344.83977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855344.84095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855344.84098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855344.84100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855344.84117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855344.84135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855344.84419: variable '__network_required_facts' from source: role '' defaults 30582 1726855344.84423: variable 'ansible_facts' from source: unknown 30582 1726855344.85110: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855344.85115: when evaluation is False, skipping this task 30582 1726855344.85117: _execute() done 30582 1726855344.85120: dumping result to json 30582 1726855344.85127: done dumping result, returning 30582 1726855344.85136: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000001a16] 30582 1726855344.85145: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a16 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855344.85632: no more pending results, returning what we have 30582 1726855344.85637: results queue empty 30582 1726855344.85638: checking for any_errors_fatal 30582 1726855344.85640: done checking for any_errors_fatal 30582 1726855344.85641: checking for max_fail_percentage 30582 1726855344.85643: done checking for max_fail_percentage 30582 1726855344.85644: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.85644: done checking to see if all hosts have failed 30582 1726855344.85645: getting the remaining hosts for this loop 30582 1726855344.85646: done getting the remaining hosts for this loop 30582 1726855344.85650: getting the next task for host managed_node3 30582 1726855344.85661: done getting next task for host managed_node3 30582 1726855344.85665: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855344.85674: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855344.85698: getting variables 30582 1726855344.85700: in VariableManager get_vars() 30582 1726855344.85738: Calling all_inventory to load vars for managed_node3 30582 1726855344.85741: Calling groups_inventory to load vars for managed_node3 30582 1726855344.85743: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.85751: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.85754: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.85757: Calling groups_plugins_play to load vars for managed_node3 30582 1726855344.86421: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a16 30582 1726855344.86960: WORKER PROCESS EXITING 30582 1726855344.87418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855344.95304: done with get_vars() 30582 1726855344.95335: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:24 -0400 (0:00:00.177) 0:01:21.304 ****** 30582 1726855344.95429: entering _queue_task() for managed_node3/stat 30582 1726855344.95851: worker is 1 (out of 1 available) 30582 1726855344.95865: exiting _queue_task() for managed_node3/stat 30582 1726855344.95876: done queuing things up, now waiting for results queue to drain 30582 1726855344.95878: waiting for pending results... 
30582 1726855344.96149: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855344.96385: in run() - task 0affcc66-ac2b-aa83-7d57-000000001a18 30582 1726855344.96392: variable 'ansible_search_path' from source: unknown 30582 1726855344.96397: variable 'ansible_search_path' from source: unknown 30582 1726855344.96401: calling self._execute() 30582 1726855344.96493: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855344.96593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855344.96599: variable 'omit' from source: magic vars 30582 1726855344.96901: variable 'ansible_distribution_major_version' from source: facts 30582 1726855344.96913: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855344.97177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855344.97509: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855344.97554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855344.97652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855344.97692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855344.97774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855344.97800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855344.97825: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855344.97850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855344.97948: variable '__network_is_ostree' from source: set_fact 30582 1726855344.97956: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855344.97959: when evaluation is False, skipping this task 30582 1726855344.97961: _execute() done 30582 1726855344.97964: dumping result to json 30582 1726855344.97966: done dumping result, returning 30582 1726855344.97979: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000001a18] 30582 1726855344.97985: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a18 30582 1726855344.98091: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a18 30582 1726855344.98094: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855344.98157: no more pending results, returning what we have 30582 1726855344.98161: results queue empty 30582 1726855344.98163: checking for any_errors_fatal 30582 1726855344.98175: done checking for any_errors_fatal 30582 1726855344.98176: checking for max_fail_percentage 30582 1726855344.98179: done checking for max_fail_percentage 30582 1726855344.98180: checking to see if all hosts have failed and the running result is not ok 30582 1726855344.98181: done checking to see if all hosts have failed 30582 1726855344.98181: getting the remaining hosts for this loop 30582 1726855344.98183: done getting the remaining hosts for this loop 30582 
1726855344.98189: getting the next task for host managed_node3 30582 1726855344.98198: done getting next task for host managed_node3 30582 1726855344.98203: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855344.98210: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855344.98236: getting variables 30582 1726855344.98238: in VariableManager get_vars() 30582 1726855344.98286: Calling all_inventory to load vars for managed_node3 30582 1726855344.98293: Calling groups_inventory to load vars for managed_node3 30582 1726855344.98296: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855344.98308: Calling all_plugins_play to load vars for managed_node3 30582 1726855344.98312: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855344.98315: Calling groups_plugins_play to load vars for managed_node3 30582 1726855345.00005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855345.01655: done with get_vars() 30582 1726855345.01693: done getting variables 30582 1726855345.01761: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:25 -0400 (0:00:00.063) 0:01:21.367 ****** 30582 1726855345.01810: entering _queue_task() for managed_node3/set_fact 30582 1726855345.02323: worker is 1 (out of 1 available) 30582 1726855345.02335: exiting _queue_task() for managed_node3/set_fact 30582 1726855345.02347: done queuing things up, now waiting for results queue to drain 30582 1726855345.02348: waiting for pending results... 
30582 1726855345.02661: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855345.02758: in run() - task 0affcc66-ac2b-aa83-7d57-000000001a19 30582 1726855345.02772: variable 'ansible_search_path' from source: unknown 30582 1726855345.02775: variable 'ansible_search_path' from source: unknown 30582 1726855345.02818: calling self._execute() 30582 1726855345.02921: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855345.02926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855345.02937: variable 'omit' from source: magic vars 30582 1726855345.03358: variable 'ansible_distribution_major_version' from source: facts 30582 1726855345.03373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855345.03545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855345.03845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855345.03895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855345.03981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855345.04019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855345.04113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855345.04161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855345.04165: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855345.04198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855345.04301: variable '__network_is_ostree' from source: set_fact 30582 1726855345.04378: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855345.04381: when evaluation is False, skipping this task 30582 1726855345.04383: _execute() done 30582 1726855345.04386: dumping result to json 30582 1726855345.04389: done dumping result, returning 30582 1726855345.04392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000001a19] 30582 1726855345.04394: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a19 30582 1726855345.04458: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a19 30582 1726855345.04461: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855345.04530: no more pending results, returning what we have 30582 1726855345.04534: results queue empty 30582 1726855345.04535: checking for any_errors_fatal 30582 1726855345.04542: done checking for any_errors_fatal 30582 1726855345.04542: checking for max_fail_percentage 30582 1726855345.04544: done checking for max_fail_percentage 30582 1726855345.04546: checking to see if all hosts have failed and the running result is not ok 30582 1726855345.04546: done checking to see if all hosts have failed 30582 1726855345.04547: getting the remaining hosts for this loop 30582 1726855345.04549: done getting the remaining hosts for this loop 
30582 1726855345.04553: getting the next task for host managed_node3 30582 1726855345.04565: done getting next task for host managed_node3 30582 1726855345.04572: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855345.04578: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855345.04713: getting variables 30582 1726855345.04715: in VariableManager get_vars() 30582 1726855345.04759: Calling all_inventory to load vars for managed_node3 30582 1726855345.04762: Calling groups_inventory to load vars for managed_node3 30582 1726855345.04764: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855345.04778: Calling all_plugins_play to load vars for managed_node3 30582 1726855345.04782: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855345.04786: Calling groups_plugins_play to load vars for managed_node3 30582 1726855345.06576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855345.08229: done with get_vars() 30582 1726855345.08258: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:25 -0400 (0:00:00.065) 0:01:21.433 ****** 30582 1726855345.08370: entering _queue_task() for managed_node3/service_facts 30582 1726855345.08770: worker is 1 (out of 1 available) 30582 1726855345.08783: exiting _queue_task() for managed_node3/service_facts 30582 1726855345.08900: done queuing things up, now waiting for results queue to drain 30582 1726855345.08903: waiting for pending results... 
30582 1726855345.09160: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855345.09361: in run() - task 0affcc66-ac2b-aa83-7d57-000000001a1b 30582 1726855345.09365: variable 'ansible_search_path' from source: unknown 30582 1726855345.09368: variable 'ansible_search_path' from source: unknown 30582 1726855345.09371: calling self._execute() 30582 1726855345.09465: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855345.09473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855345.09486: variable 'omit' from source: magic vars 30582 1726855345.09894: variable 'ansible_distribution_major_version' from source: facts 30582 1726855345.09920: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855345.09924: variable 'omit' from source: magic vars 30582 1726855345.10001: variable 'omit' from source: magic vars 30582 1726855345.10192: variable 'omit' from source: magic vars 30582 1726855345.10196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855345.10199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855345.10202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855345.10205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855345.10208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855345.10211: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855345.10214: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855345.10217: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855345.10336: Set connection var ansible_timeout to 10 30582 1726855345.10339: Set connection var ansible_connection to ssh 30582 1726855345.10347: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855345.10352: Set connection var ansible_pipelining to False 30582 1726855345.10357: Set connection var ansible_shell_executable to /bin/sh 30582 1726855345.10359: Set connection var ansible_shell_type to sh 30582 1726855345.10386: variable 'ansible_shell_executable' from source: unknown 30582 1726855345.10391: variable 'ansible_connection' from source: unknown 30582 1726855345.10394: variable 'ansible_module_compression' from source: unknown 30582 1726855345.10396: variable 'ansible_shell_type' from source: unknown 30582 1726855345.10398: variable 'ansible_shell_executable' from source: unknown 30582 1726855345.10401: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855345.10405: variable 'ansible_pipelining' from source: unknown 30582 1726855345.10408: variable 'ansible_timeout' from source: unknown 30582 1726855345.10412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855345.10628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855345.10645: variable 'omit' from source: magic vars 30582 1726855345.10650: starting attempt loop 30582 1726855345.10658: running the handler 30582 1726855345.10769: _low_level_execute_command(): starting 30582 1726855345.10773: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855345.11368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855345.11391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855345.11409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855345.11434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855345.11453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855345.11510: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855345.11570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855345.11595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855345.11610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855345.11717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855345.13426: stdout chunk (state=3): >>>/root <<< 30582 1726855345.13584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855345.13590: stdout chunk (state=3): >>><<< 30582 1726855345.13593: stderr chunk (state=3): >>><<< 30582 1726855345.13624: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855345.13728: _low_level_execute_command(): starting 30582 1726855345.13732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868 `" && echo ansible-tmp-1726855345.1363301-34348-13365275194868="` echo /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868 `" ) && sleep 0' 30582 1726855345.14264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855345.14280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855345.14298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855345.14317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855345.14344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 <<< 30582 1726855345.14356: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855345.14369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855345.14398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855345.14413: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855345.14499: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855345.14521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855345.14542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855345.14637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855345.16583: stdout chunk (state=3): >>>ansible-tmp-1726855345.1363301-34348-13365275194868=/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868 <<< 30582 1726855345.16730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855345.16752: stdout chunk (state=3): >>><<< 30582 1726855345.16771: stderr chunk (state=3): >>><<< 30582 1726855345.16994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855345.1363301-34348-13365275194868=/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855345.16998: variable 'ansible_module_compression' from source: unknown 30582 1726855345.17000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855345.17002: variable 'ansible_facts' from source: unknown 30582 1726855345.17033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py 30582 1726855345.17257: Sending initial data 30582 1726855345.17261: Sent initial data (161 bytes) 30582 1726855345.18022: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855345.18124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855345.18157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855345.18176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855345.18202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855345.18308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855345.19984: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855345.20093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855345.20193: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpvna5o0vd /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py <<< 30582 1726855345.20199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py" <<< 30582 1726855345.20344: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpvna5o0vd" to remote "/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py" <<< 30582 1726855345.22054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855345.22058: stdout chunk (state=3): >>><<< 30582 1726855345.22060: stderr chunk (state=3): >>><<< 30582 1726855345.22063: done transferring module to remote 30582 1726855345.22065: _low_level_execute_command(): starting 30582 1726855345.22067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/ /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py && sleep 0' 30582 1726855345.23225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855345.23311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855345.23420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855345.23500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855345.25402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855345.25431: stdout chunk (state=3): >>><<< 30582 1726855345.25444: stderr chunk (state=3): >>><<< 30582 1726855345.25466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855345.25475: _low_level_execute_command(): starting 30582 1726855345.25566: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/AnsiballZ_service_facts.py && sleep 0' 30582 1726855345.26324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855345.26327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855345.26330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855345.26332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855345.26335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855345.26337: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855345.26339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855345.26341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855345.26343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855345.26345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855345.26349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855345.26352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855345.26354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855345.26356: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855345.26358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855345.26425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855346.77458: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855346.77529: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": 
"alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30582 1726855346.77552: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855346.79281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855346.79286: stdout chunk (state=3): >>><<< 30582 1726855346.79298: stderr chunk (state=3): >>><<< 30582 1726855346.79326: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": 
{"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855346.80678: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855346.80686: _low_level_execute_command(): starting 30582 1726855346.80847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855345.1363301-34348-13365275194868/ > /dev/null 2>&1 && sleep 0' 30582 1726855346.81506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855346.81534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855346.81551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855346.81568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855346.81646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855346.81685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855346.81706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855346.81722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855346.81816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855346.83735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855346.83738: stdout chunk (state=3): >>><<< 30582 1726855346.83740: stderr chunk (state=3): >>><<< 30582 1726855346.83864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855346.83869: handler run complete 30582 1726855346.84016: variable 'ansible_facts' from source: unknown 30582 1726855346.84186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855346.84725: variable 'ansible_facts' from source: unknown 30582 1726855346.84880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855346.85179: attempt loop complete, returning result 30582 1726855346.85183: _execute() done 30582 1726855346.85186: dumping result to json 30582 1726855346.85190: done dumping result, returning 30582 1726855346.85195: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000001a1b] 30582 1726855346.85205: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a1b 30582 1726855346.86563: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a1b 30582 1726855346.86567: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855346.86698: no more pending results, returning what we have 30582 1726855346.86701: results queue empty 30582 1726855346.86702: checking for any_errors_fatal 30582 1726855346.86709: done checking for any_errors_fatal 30582 1726855346.86710: checking for max_fail_percentage 30582 1726855346.86712: done checking for max_fail_percentage 30582 1726855346.86713: checking to see if all hosts have failed and the running result is not ok 30582 1726855346.86713: done checking to see if all hosts have failed 30582 1726855346.86714: getting the remaining hosts for this loop 30582 1726855346.86715: done getting the remaining hosts for this loop 30582 1726855346.86719: getting 
the next task for host managed_node3 30582 1726855346.86727: done getting next task for host managed_node3 30582 1726855346.86731: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855346.86738: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855346.86752: getting variables 30582 1726855346.86754: in VariableManager get_vars() 30582 1726855346.87028: Calling all_inventory to load vars for managed_node3 30582 1726855346.87032: Calling groups_inventory to load vars for managed_node3 30582 1726855346.87034: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855346.87045: Calling all_plugins_play to load vars for managed_node3 30582 1726855346.87048: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855346.87058: Calling groups_plugins_play to load vars for managed_node3 30582 1726855346.90239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855346.93781: done with get_vars() 30582 1726855346.93820: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:26 -0400 (0:00:01.856) 0:01:23.290 ****** 30582 1726855346.94052: entering _queue_task() for managed_node3/package_facts 30582 1726855346.94844: worker is 1 (out of 1 available) 30582 1726855346.94974: exiting _queue_task() for managed_node3/package_facts 30582 1726855346.94986: done queuing things up, now waiting for results queue to drain 30582 1726855346.94990: waiting for pending results... 
30582 1726855346.95556: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855346.96096: in run() - task 0affcc66-ac2b-aa83-7d57-000000001a1c 30582 1726855346.96100: variable 'ansible_search_path' from source: unknown 30582 1726855346.96103: variable 'ansible_search_path' from source: unknown 30582 1726855346.96106: calling self._execute() 30582 1726855346.96108: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855346.96111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855346.96114: variable 'omit' from source: magic vars 30582 1726855346.96463: variable 'ansible_distribution_major_version' from source: facts 30582 1726855346.96479: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855346.96506: variable 'omit' from source: magic vars 30582 1726855346.96598: variable 'omit' from source: magic vars 30582 1726855346.96635: variable 'omit' from source: magic vars 30582 1726855346.96680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855346.96724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855346.96749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855346.96767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855346.96783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855346.96814: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855346.96818: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855346.96820: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855346.97051: Set connection var ansible_timeout to 10 30582 1726855346.97056: Set connection var ansible_connection to ssh 30582 1726855346.97058: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855346.97061: Set connection var ansible_pipelining to False 30582 1726855346.97063: Set connection var ansible_shell_executable to /bin/sh 30582 1726855346.97064: Set connection var ansible_shell_type to sh 30582 1726855346.97066: variable 'ansible_shell_executable' from source: unknown 30582 1726855346.97069: variable 'ansible_connection' from source: unknown 30582 1726855346.97071: variable 'ansible_module_compression' from source: unknown 30582 1726855346.97073: variable 'ansible_shell_type' from source: unknown 30582 1726855346.97075: variable 'ansible_shell_executable' from source: unknown 30582 1726855346.97077: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855346.97079: variable 'ansible_pipelining' from source: unknown 30582 1726855346.97081: variable 'ansible_timeout' from source: unknown 30582 1726855346.97083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855346.97380: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855346.97386: variable 'omit' from source: magic vars 30582 1726855346.97390: starting attempt loop 30582 1726855346.97392: running the handler 30582 1726855346.97395: _low_level_execute_command(): starting 30582 1726855346.97397: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855346.98003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855346.98020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855346.98029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855346.98132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855346.98144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855346.98277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855346.98796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.00407: stdout chunk (state=3): >>>/root <<< 30582 1726855347.00529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.00555: stderr chunk (state=3): >>><<< 30582 1726855347.00559: stdout chunk (state=3): >>><<< 30582 1726855347.00711: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855347.00746: _low_level_execute_command(): starting 30582 1726855347.00750: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940 `" && echo ansible-tmp-1726855347.0071135-34437-6518412144940="` echo /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940 `" ) && sleep 0' 30582 1726855347.01965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855347.01968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855347.01971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855347.01974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855347.02315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855347.02330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855347.02461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.04345: stdout chunk (state=3): >>>ansible-tmp-1726855347.0071135-34437-6518412144940=/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940 <<< 30582 1726855347.04403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.04438: stderr chunk (state=3): >>><<< 30582 1726855347.04441: stdout chunk (state=3): >>><<< 30582 1726855347.04463: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855347.0071135-34437-6518412144940=/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855347.04518: variable 'ansible_module_compression' from source: unknown 30582 1726855347.04568: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855347.04736: variable 'ansible_facts' from source: unknown 30582 1726855347.05128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py 30582 1726855347.05426: Sending initial data 30582 1726855347.05429: Sent initial data (160 bytes) 30582 1726855347.07215: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855347.07517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855347.07624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.09154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855347.09320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855347.09363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_wdb68sa /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py <<< 30582 1726855347.09367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py" <<< 30582 1726855347.09429: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_wdb68sa" to remote "/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py" <<< 30582 1726855347.13633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.13637: stdout chunk (state=3): >>><<< 30582 1726855347.13640: stderr chunk (state=3): >>><<< 30582 1726855347.13642: done transferring module to remote 30582 1726855347.13644: _low_level_execute_command(): starting 30582 1726855347.13646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/ /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py && sleep 0' 30582 1726855347.14995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855347.15205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855347.15304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.17482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.17486: stdout chunk (state=3): >>><<< 30582 1726855347.17490: stderr chunk (state=3): >>><<< 30582 1726855347.17494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855347.17500: _low_level_execute_command(): starting 30582 1726855347.17503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/AnsiballZ_package_facts.py && sleep 0' 30582 1726855347.18900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855347.18990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855347.19005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855347.19019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855347.19032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855347.19128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.63585: stdout chunk (state=3): >>> 
{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": 
"2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": 
"0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", 
"version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs",
"version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", 
"release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855347.63806: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": 
[{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", 
"version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": 
"1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", 
"version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855347.65609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.65613: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855347.65615: stdout chunk (state=3): >>><<< 30582 1726855347.65618: stderr chunk (state=3): >>><<< 30582 1726855347.65804: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", 
"version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": 
"5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": 
"20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": 
"libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": 
"13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": 
[{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": 
[{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855347.68042: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855347.68094: _low_level_execute_command(): starting 30582 1726855347.68098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855347.0071135-34437-6518412144940/ > /dev/null 2>&1 && sleep 0' 30582 1726855347.68898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855347.68919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855347.68944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855347.69043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855347.70958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855347.70962: stdout chunk (state=3): >>><<< 30582 1726855347.70965: stderr chunk (state=3): >>><<< 30582 1726855347.71194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855347.71197: handler run complete 30582 1726855347.71903: variable 'ansible_facts' from source: unknown 30582 1726855347.72416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855347.74395: variable 'ansible_facts' from source: unknown 30582 1726855347.74858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855347.75578: attempt loop complete, returning result 30582 1726855347.75597: _execute() done 30582 1726855347.75603: dumping result to json 30582 1726855347.75816: done dumping result, returning 30582 1726855347.75831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000001a1c] 30582 1726855347.75841: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a1c 30582 1726855347.78369: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001a1c 30582 1726855347.78372: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855347.78547: no more pending results, returning what we have 30582 1726855347.78550: results queue empty 30582 1726855347.78551: checking for any_errors_fatal 30582 1726855347.78557: done checking for any_errors_fatal 30582 1726855347.78558: checking for max_fail_percentage 30582 1726855347.78560: done checking for max_fail_percentage 30582 1726855347.78561: checking to see if all hosts have failed and the running result is not ok 30582 1726855347.78562: done checking to see if all hosts have failed 30582 1726855347.78562: getting the remaining 
hosts for this loop 30582 1726855347.78563: done getting the remaining hosts for this loop 30582 1726855347.78567: getting the next task for host managed_node3 30582 1726855347.78574: done getting next task for host managed_node3 30582 1726855347.78578: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855347.78583: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855347.78598: getting variables 30582 1726855347.78599: in VariableManager get_vars() 30582 1726855347.78631: Calling all_inventory to load vars for managed_node3 30582 1726855347.78633: Calling groups_inventory to load vars for managed_node3 30582 1726855347.78644: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855347.78654: Calling all_plugins_play to load vars for managed_node3 30582 1726855347.78656: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855347.78659: Calling groups_plugins_play to load vars for managed_node3 30582 1726855347.80007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855347.81762: done with get_vars() 30582 1726855347.81791: done getting variables 30582 1726855347.81866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:27 -0400 (0:00:00.878) 0:01:24.169 ****** 30582 1726855347.81922: entering _queue_task() for managed_node3/debug 30582 1726855347.82327: worker is 1 (out of 1 available) 30582 1726855347.82589: exiting _queue_task() for managed_node3/debug 30582 1726855347.82599: done queuing things up, now waiting for results queue to drain 30582 1726855347.82601: waiting for pending results... 
30582 1726855347.82731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855347.82818: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c0 30582 1726855347.82853: variable 'ansible_search_path' from source: unknown 30582 1726855347.82861: variable 'ansible_search_path' from source: unknown 30582 1726855347.82906: calling self._execute() 30582 1726855347.83013: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855347.83046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855347.83049: variable 'omit' from source: magic vars 30582 1726855347.83450: variable 'ansible_distribution_major_version' from source: facts 30582 1726855347.83467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855347.83494: variable 'omit' from source: magic vars 30582 1726855347.83593: variable 'omit' from source: magic vars 30582 1726855347.83667: variable 'network_provider' from source: set_fact 30582 1726855347.83699: variable 'omit' from source: magic vars 30582 1726855347.83744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855347.83781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855347.83913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855347.83917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855347.83921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855347.83924: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855347.83926: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855347.83928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855347.84129: Set connection var ansible_timeout to 10 30582 1726855347.84133: Set connection var ansible_connection to ssh 30582 1726855347.84135: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855347.84137: Set connection var ansible_pipelining to False 30582 1726855347.84139: Set connection var ansible_shell_executable to /bin/sh 30582 1726855347.84141: Set connection var ansible_shell_type to sh 30582 1726855347.84143: variable 'ansible_shell_executable' from source: unknown 30582 1726855347.84152: variable 'ansible_connection' from source: unknown 30582 1726855347.84157: variable 'ansible_module_compression' from source: unknown 30582 1726855347.84160: variable 'ansible_shell_type' from source: unknown 30582 1726855347.84162: variable 'ansible_shell_executable' from source: unknown 30582 1726855347.84165: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855347.84167: variable 'ansible_pipelining' from source: unknown 30582 1726855347.84169: variable 'ansible_timeout' from source: unknown 30582 1726855347.84171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855347.84316: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855347.84333: variable 'omit' from source: magic vars 30582 1726855347.84349: starting attempt loop 30582 1726855347.84357: running the handler 30582 1726855347.84420: handler run complete 30582 1726855347.84454: attempt loop complete, returning result 30582 1726855347.84457: _execute() done 30582 1726855347.84460: dumping result to json 30582 1726855347.84462: done dumping result, returning 
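The guard logged above, `Evaluated conditional (ansible_distribution_major_version != '6'): True`, compares against the string `'6'` because Ansible distribution facts are strings. A minimal Python sketch of that evaluation (the fact value `"10"` is an assumed example, not taken from this log):

```python
# Distribution facts arrive as strings, so equality guards compare
# against the string "6" rather than the integer 6.
facts = {"ansible_distribution_major_version": "10"}  # assumed example value

# Equivalent of the Jinja guard: ansible_distribution_major_version != '6'
runs_on_this_host = facts["ansible_distribution_major_version"] != "6"

# Ordering comparisons need an explicit cast, mirroring the later
# (ansible_distribution_major_version | int > 9) guard in this trace.
is_el10_or_later = int(facts["ansible_distribution_major_version"]) > 9

print(runs_on_this_host, is_el10_or_later)
```

When the equality guard evaluates False, the TaskExecutor reports `when evaluation is False, skipping this task`, as seen for the `network_state != {}` conditionals later in this trace.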
30582 1726855347.84563: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-0000000019c0] 30582 1726855347.84566: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c0 ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855347.84744: no more pending results, returning what we have 30582 1726855347.84749: results queue empty 30582 1726855347.84751: checking for any_errors_fatal 30582 1726855347.84762: done checking for any_errors_fatal 30582 1726855347.84762: checking for max_fail_percentage 30582 1726855347.84765: done checking for max_fail_percentage 30582 1726855347.84766: checking to see if all hosts have failed and the running result is not ok 30582 1726855347.84767: done checking to see if all hosts have failed 30582 1726855347.84768: getting the remaining hosts for this loop 30582 1726855347.84769: done getting the remaining hosts for this loop 30582 1726855347.84893: getting the next task for host managed_node3 30582 1726855347.84903: done getting next task for host managed_node3 30582 1726855347.84908: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855347.84913: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855347.84929: getting variables 30582 1726855347.84931: in VariableManager get_vars() 30582 1726855347.84975: Calling all_inventory to load vars for managed_node3 30582 1726855347.84979: Calling groups_inventory to load vars for managed_node3 30582 1726855347.84981: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855347.85107: Calling all_plugins_play to load vars for managed_node3 30582 1726855347.85112: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855347.85120: Calling groups_plugins_play to load vars for managed_node3 30582 1726855347.85721: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c0 30582 1726855347.85724: WORKER PROCESS EXITING 30582 1726855347.86652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855347.88326: done with get_vars() 30582 1726855347.88362: done getting variables 30582 1726855347.88423: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:27 -0400 (0:00:00.065) 0:01:24.234 ****** 30582 1726855347.88473: entering _queue_task() for managed_node3/fail 30582 1726855347.88858: worker is 1 (out of 1 available) 30582 1726855347.88873: exiting _queue_task() for managed_node3/fail 30582 1726855347.88885: done queuing things up, now waiting for results queue to drain 30582 1726855347.89017: waiting for pending results... 30582 1726855347.89243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855347.89429: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c1 30582 1726855347.89451: variable 'ansible_search_path' from source: unknown 30582 1726855347.89461: variable 'ansible_search_path' from source: unknown 30582 1726855347.89515: calling self._execute() 30582 1726855347.89622: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855347.89634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855347.89694: variable 'omit' from source: magic vars 30582 1726855347.90070: variable 'ansible_distribution_major_version' from source: facts 30582 1726855347.90089: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855347.90237: variable 'network_state' from source: role '' defaults 30582 1726855347.90255: Evaluated conditional (network_state != {}): False 30582 1726855347.90262: when evaluation is False, skipping this task 30582 1726855347.90268: _execute() done 30582 1726855347.90273: dumping result to json 30582 1726855347.90342: done dumping result, returning 30582 1726855347.90345: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-0000000019c1] 30582 1726855347.90348: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c1 30582 1726855347.90427: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c1 30582 1726855347.90431: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855347.90500: no more pending results, returning what we have 30582 1726855347.90505: results queue empty 30582 1726855347.90506: checking for any_errors_fatal 30582 1726855347.90517: done checking for any_errors_fatal 30582 1726855347.90518: checking for max_fail_percentage 30582 1726855347.90520: done checking for max_fail_percentage 30582 1726855347.90522: checking to see if all hosts have failed and the running result is not ok 30582 1726855347.90522: done checking to see if all hosts have failed 30582 1726855347.90523: getting the remaining hosts for this loop 30582 1726855347.90525: done getting the remaining hosts for this loop 30582 1726855347.90529: getting the next task for host managed_node3 30582 1726855347.90537: done getting next task for host managed_node3 30582 1726855347.90542: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855347.90548: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855347.90697: getting variables 30582 1726855347.90700: in VariableManager get_vars() 30582 1726855347.90747: Calling all_inventory to load vars for managed_node3 30582 1726855347.90750: Calling groups_inventory to load vars for managed_node3 30582 1726855347.90752: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855347.90764: Calling all_plugins_play to load vars for managed_node3 30582 1726855347.90767: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855347.90771: Calling groups_plugins_play to load vars for managed_node3 30582 1726855347.92652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855347.94267: done with get_vars() 30582 1726855347.94308: done getting variables 30582 1726855347.94375: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:27 -0400 (0:00:00.059) 0:01:24.294 ****** 30582 1726855347.94427: entering _queue_task() for managed_node3/fail 30582 1726855347.94900: worker is 1 (out of 1 available) 30582 1726855347.94914: exiting _queue_task() for managed_node3/fail 30582 1726855347.94925: done queuing things up, now waiting for results queue to drain 30582 1726855347.94926: waiting for pending results... 30582 1726855347.95281: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855347.95336: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c2 30582 1726855347.95354: variable 'ansible_search_path' from source: unknown 30582 1726855347.95360: variable 'ansible_search_path' from source: unknown 30582 1726855347.95410: calling self._execute() 30582 1726855347.95523: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855347.95535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855347.95598: variable 'omit' from source: magic vars 30582 1726855347.95954: variable 'ansible_distribution_major_version' from source: facts 30582 1726855347.95973: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855347.96108: variable 'network_state' from source: role '' defaults 30582 1726855347.96125: Evaluated conditional (network_state != {}): False 30582 1726855347.96142: when evaluation is False, skipping this task 30582 1726855347.96152: _execute() done 30582 1726855347.96248: dumping result to json 30582 1726855347.96251: done dumping result, returning 30582 1726855347.96254: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-0000000019c2] 30582 1726855347.96257: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c2 30582 1726855347.96328: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c2 30582 1726855347.96332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855347.96409: no more pending results, returning what we have 30582 1726855347.96414: results queue empty 30582 1726855347.96416: checking for any_errors_fatal 30582 1726855347.96425: done checking for any_errors_fatal 30582 1726855347.96426: checking for max_fail_percentage 30582 1726855347.96429: done checking for max_fail_percentage 30582 1726855347.96430: checking to see if all hosts have failed and the running result is not ok 30582 1726855347.96431: done checking to see if all hosts have failed 30582 1726855347.96432: getting the remaining hosts for this loop 30582 1726855347.96434: done getting the remaining hosts for this loop 30582 1726855347.96439: getting the next task for host managed_node3 30582 1726855347.96450: done getting next task for host managed_node3 30582 1726855347.96455: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855347.96461: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855347.96495: getting variables 30582 1726855347.96497: in VariableManager get_vars() 30582 1726855347.96551: Calling all_inventory to load vars for managed_node3 30582 1726855347.96554: Calling groups_inventory to load vars for managed_node3 30582 1726855347.96556: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855347.96570: Calling all_plugins_play to load vars for managed_node3 30582 1726855347.96573: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855347.96577: Calling groups_plugins_play to load vars for managed_node3 30582 1726855347.98312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.00012: done with get_vars() 30582 1726855348.00043: done getting variables 30582 1726855348.00117: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:28 -0400 (0:00:00.057) 0:01:24.351 ****** 30582 1726855348.00156: entering _queue_task() for managed_node3/fail 30582 1726855348.00553: worker is 1 (out of 1 available) 30582 1726855348.00565: exiting _queue_task() for managed_node3/fail 30582 1726855348.00577: done queuing things up, now waiting for results queue to drain 30582 1726855348.00579: waiting for pending results... 30582 1726855348.01011: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855348.01066: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c3 30582 1726855348.01103: variable 'ansible_search_path' from source: unknown 30582 1726855348.01107: variable 'ansible_search_path' from source: unknown 30582 1726855348.01212: calling self._execute() 30582 1726855348.01259: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.01271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.01284: variable 'omit' from source: magic vars 30582 1726855348.01725: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.01743: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.01948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.04841: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.04925: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.05011: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.05016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.05056: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.05165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.05195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.05240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.05338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.05343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.05423: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.05447: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855348.05570: variable 'ansible_distribution' from source: facts 30582 1726855348.05580: variable '__network_rh_distros' from source: role '' defaults 30582 1726855348.05664: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855348.05876: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.05938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.05966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.06022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.06093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.06097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.06208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.06213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.06216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855348.06231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.06282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.06309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.06340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.06443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.06446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.06752: variable 'network_connections' from source: include params 30582 1726855348.06777: variable 'interface' from source: play vars 30582 1726855348.06881: variable 'interface' from source: play vars 30582 1726855348.06884: variable 'network_state' from source: role '' defaults 30582 1726855348.06948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855348.07146: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855348.07192: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855348.07239: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855348.07273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855348.07334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855348.07361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855348.07403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.07445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855348.07478: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855348.07486: when evaluation is False, skipping this task 30582 1726855348.07533: _execute() done 30582 1726855348.07536: dumping result to json 30582 1726855348.07539: done dumping result, returning 30582 1726855348.07542: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-0000000019c3] 30582 1726855348.07544: sending task result for task 
0affcc66-ac2b-aa83-7d57-0000000019c3 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855348.07697: no more pending results, returning what we have 30582 1726855348.07701: results queue empty 30582 1726855348.07703: checking for any_errors_fatal 30582 1726855348.07712: done checking for any_errors_fatal 30582 1726855348.07712: checking for max_fail_percentage 30582 1726855348.07715: done checking for max_fail_percentage 30582 1726855348.07716: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.07717: done checking to see if all hosts have failed 30582 1726855348.07718: getting the remaining hosts for this loop 30582 1726855348.07719: done getting the remaining hosts for this loop 30582 1726855348.07724: getting the next task for host managed_node3 30582 1726855348.07733: done getting next task for host managed_node3 30582 1726855348.07737: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855348.07742: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855348.07773: getting variables 30582 1726855348.07776: in VariableManager get_vars() 30582 1726855348.07825: Calling all_inventory to load vars for managed_node3 30582 1726855348.07829: Calling groups_inventory to load vars for managed_node3 30582 1726855348.07831: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.07844: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.07847: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.07850: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.08660: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c3 30582 1726855348.08663: WORKER PROCESS EXITING 30582 1726855348.09902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.11537: done with get_vars() 30582 1726855348.11569: done getting variables 30582 1726855348.11643: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:28 -0400 (0:00:00.115) 0:01:24.466 ****** 30582 1726855348.11691: entering _queue_task() for managed_node3/dnf 30582 1726855348.12081: worker is 1 (out of 1 available) 30582 1726855348.12236: exiting _queue_task() for managed_node3/dnf 30582 1726855348.12248: done queuing things up, now waiting for results queue to drain 30582 1726855348.12250: waiting for pending results... 30582 1726855348.12437: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855348.12625: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c4 30582 1726855348.12645: variable 'ansible_search_path' from source: unknown 30582 1726855348.12654: variable 'ansible_search_path' from source: unknown 30582 1726855348.12706: calling self._execute() 30582 1726855348.12808: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.12818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.12829: variable 'omit' from source: magic vars 30582 1726855348.13248: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.13264: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.13484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.15922: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.16003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.16051: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.16091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.16160: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.16218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.16274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.16306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.16349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.16481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.16512: variable 'ansible_distribution' from source: facts 30582 1726855348.16523: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.16545: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855348.16668: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855348.16824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.16854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.16885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.16943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.16963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.17010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.17051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.17083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.17142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.17247: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.17250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.17253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.17256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.17305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.17324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.17515: variable 'network_connections' from source: include params 30582 1726855348.17533: variable 'interface' from source: play vars 30582 1726855348.17614: variable 'interface' from source: play vars 30582 1726855348.17706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855348.17899: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855348.17945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855348.18008: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855348.18024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855348.18069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855348.18097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855348.18236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.18239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855348.18242: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855348.18493: variable 'network_connections' from source: include params 30582 1726855348.18504: variable 'interface' from source: play vars 30582 1726855348.18577: variable 'interface' from source: play vars 30582 1726855348.18608: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855348.18616: when evaluation is False, skipping this task 30582 1726855348.18623: _execute() done 30582 1726855348.18630: dumping result to json 30582 1726855348.18636: done dumping result, returning 30582 1726855348.18671: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000019c4] 30582 
1726855348.18674: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c4 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855348.19038: no more pending results, returning what we have 30582 1726855348.19043: results queue empty 30582 1726855348.19044: checking for any_errors_fatal 30582 1726855348.19050: done checking for any_errors_fatal 30582 1726855348.19051: checking for max_fail_percentage 30582 1726855348.19054: done checking for max_fail_percentage 30582 1726855348.19055: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.19056: done checking to see if all hosts have failed 30582 1726855348.19056: getting the remaining hosts for this loop 30582 1726855348.19058: done getting the remaining hosts for this loop 30582 1726855348.19062: getting the next task for host managed_node3 30582 1726855348.19070: done getting next task for host managed_node3 30582 1726855348.19075: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855348.19080: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855348.19109: getting variables 30582 1726855348.19111: in VariableManager get_vars() 30582 1726855348.19156: Calling all_inventory to load vars for managed_node3 30582 1726855348.19159: Calling groups_inventory to load vars for managed_node3 30582 1726855348.19162: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.19173: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.19176: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.19179: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.19706: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c4 30582 1726855348.19711: WORKER PROCESS EXITING 30582 1726855348.20843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.22735: done with get_vars() 30582 1726855348.22763: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855348.22855: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:28 -0400 (0:00:00.112) 0:01:24.578 ****** 30582 1726855348.22900: entering _queue_task() for managed_node3/yum 30582 1726855348.23504: worker is 1 (out of 1 available) 30582 1726855348.23516: exiting _queue_task() for managed_node3/yum 30582 1726855348.23525: done queuing things up, now waiting for results queue to drain 30582 1726855348.23527: waiting for pending results... 30582 1726855348.23652: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855348.23820: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c5 30582 1726855348.23845: variable 'ansible_search_path' from source: unknown 30582 1726855348.23855: variable 'ansible_search_path' from source: unknown 30582 1726855348.23906: calling self._execute() 30582 1726855348.24018: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.24029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.24043: variable 'omit' from source: magic vars 30582 1726855348.24456: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.24476: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.24753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.27074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.27154: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.27193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.27230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.27266: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.27357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.27423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.27458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.27519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.27540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.27659: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.27682: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855348.27701: when evaluation is False, skipping this task 30582 1726855348.27709: _execute() done 30582 1726855348.27716: dumping result to json 30582 1726855348.27793: done dumping result, 
returning 30582 1726855348.27806: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000019c5] 30582 1726855348.27810: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c5 30582 1726855348.27892: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c5 30582 1726855348.27896: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855348.27953: no more pending results, returning what we have 30582 1726855348.27957: results queue empty 30582 1726855348.27959: checking for any_errors_fatal 30582 1726855348.27968: done checking for any_errors_fatal 30582 1726855348.27969: checking for max_fail_percentage 30582 1726855348.27971: done checking for max_fail_percentage 30582 1726855348.27972: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.27973: done checking to see if all hosts have failed 30582 1726855348.27974: getting the remaining hosts for this loop 30582 1726855348.27975: done getting the remaining hosts for this loop 30582 1726855348.27980: getting the next task for host managed_node3 30582 1726855348.27991: done getting next task for host managed_node3 30582 1726855348.27996: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855348.28000: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855348.28031: getting variables 30582 1726855348.28034: in VariableManager get_vars() 30582 1726855348.28081: Calling all_inventory to load vars for managed_node3 30582 1726855348.28085: Calling groups_inventory to load vars for managed_node3 30582 1726855348.28202: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.28217: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.28221: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.28224: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.29896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.31589: done with get_vars() 30582 1726855348.31623: done getting variables 30582 1726855348.31695: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:28 -0400 (0:00:00.088) 0:01:24.667 ****** 30582 1726855348.31732: entering _queue_task() for managed_node3/fail 30582 1726855348.32202: worker is 1 (out of 1 available) 30582 1726855348.32216: exiting _queue_task() for managed_node3/fail 30582 1726855348.32338: done queuing things up, now waiting for results queue to drain 30582 1726855348.32340: waiting for pending results... 30582 1726855348.32698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855348.32704: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c6 30582 1726855348.32707: variable 'ansible_search_path' from source: unknown 30582 1726855348.32710: variable 'ansible_search_path' from source: unknown 30582 1726855348.32744: calling self._execute() 30582 1726855348.32860: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.32871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.32886: variable 'omit' from source: magic vars 30582 1726855348.33339: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.33359: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.33494: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855348.33712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.36163: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.36168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.36196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.36239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.36272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.36369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.36433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.36463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.36516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.36532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.36575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 
1726855348.36601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.36635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.36675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.36723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.36748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.36776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.36807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.36859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.36992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30582 1726855348.37060: variable 'network_connections' from source: include params 30582 1726855348.37079: variable 'interface' from source: play vars 30582 1726855348.37162: variable 'interface' from source: play vars 30582 1726855348.37249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855348.37426: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855348.37478: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855348.37512: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855348.37548: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855348.37593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855348.37615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855348.37659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.37768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855348.37771: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855348.38033: variable 'network_connections' from source: include params 30582 1726855348.38044: variable 'interface' from source: play 
vars 30582 1726855348.38121: variable 'interface' from source: play vars 30582 1726855348.38154: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855348.38162: when evaluation is False, skipping this task 30582 1726855348.38170: _execute() done 30582 1726855348.38178: dumping result to json 30582 1726855348.38185: done dumping result, returning 30582 1726855348.38209: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000019c6] 30582 1726855348.38219: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c6 30582 1726855348.38455: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c6 30582 1726855348.38459: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855348.38520: no more pending results, returning what we have 30582 1726855348.38530: results queue empty 30582 1726855348.38532: checking for any_errors_fatal 30582 1726855348.38540: done checking for any_errors_fatal 30582 1726855348.38540: checking for max_fail_percentage 30582 1726855348.38543: done checking for max_fail_percentage 30582 1726855348.38544: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.38545: done checking to see if all hosts have failed 30582 1726855348.38546: getting the remaining hosts for this loop 30582 1726855348.38547: done getting the remaining hosts for this loop 30582 1726855348.38552: getting the next task for host managed_node3 30582 1726855348.38561: done getting next task for host managed_node3 30582 1726855348.38566: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855348.38640: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855348.38673: getting variables 30582 1726855348.38675: in VariableManager get_vars() 30582 1726855348.38795: Calling all_inventory to load vars for managed_node3 30582 1726855348.38798: Calling groups_inventory to load vars for managed_node3 30582 1726855348.38801: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.38813: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.38817: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.38820: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.40661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.42297: done with get_vars() 30582 1726855348.42331: done getting variables 30582 1726855348.42401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:28 -0400 (0:00:00.107) 0:01:24.774 ****** 30582 1726855348.42440: entering _queue_task() for managed_node3/package 30582 1726855348.42947: worker is 1 (out of 1 available) 30582 1726855348.42961: exiting _queue_task() for managed_node3/package 30582 1726855348.42971: done queuing things up, now waiting for results queue to drain 30582 1726855348.42973: waiting for pending results... 
30582 1726855348.43307: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855348.43395: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c7 30582 1726855348.43424: variable 'ansible_search_path' from source: unknown 30582 1726855348.43448: variable 'ansible_search_path' from source: unknown 30582 1726855348.43483: calling self._execute() 30582 1726855348.43642: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.43647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.43650: variable 'omit' from source: magic vars 30582 1726855348.44061: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.44083: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.44277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855348.44638: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855348.44642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855348.44685: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855348.44776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855348.44905: variable 'network_packages' from source: role '' defaults 30582 1726855348.45027: variable '__network_provider_setup' from source: role '' defaults 30582 1726855348.45043: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855348.45185: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855348.45190: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855348.45205: variable 
'__network_packages_default_nm' from source: role '' defaults 30582 1726855348.45379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.47543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.47630: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.47675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.47725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.47803: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.47846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.47879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.47917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.47993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.47996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855348.48046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.48075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.48113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.48167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.48191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.48470: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855348.48604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.48633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.48675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.48765: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.48768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.48856: variable 'ansible_python' from source: facts 30582 1726855348.48886: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855348.48985: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855348.49077: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855348.49392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.49395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.49398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.49400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.49402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.49404: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.49425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.49457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.49506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.49535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.49678: variable 'network_connections' from source: include params 30582 1726855348.49691: variable 'interface' from source: play vars 30582 1726855348.49795: variable 'interface' from source: play vars 30582 1726855348.49896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855348.49928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855348.49969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.50008: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855348.50060: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855348.50406: variable 'network_connections' from source: include params 30582 1726855348.50409: variable 'interface' from source: play vars 30582 1726855348.50489: variable 'interface' from source: play vars 30582 1726855348.50531: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855348.50611: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855348.50972: variable 'network_connections' from source: include params 30582 1726855348.50982: variable 'interface' from source: play vars 30582 1726855348.51061: variable 'interface' from source: play vars 30582 1726855348.51092: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855348.51191: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855348.51537: variable 'network_connections' from source: include params 30582 1726855348.51548: variable 'interface' from source: play vars 30582 1726855348.51675: variable 'interface' from source: play vars 30582 1726855348.51686: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855348.51756: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855348.51769: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855348.51846: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855348.52088: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855348.52585: variable 'network_connections' from source: include params 30582 1726855348.52597: variable 'interface' from 
source: play vars 30582 1726855348.52657: variable 'interface' from source: play vars 30582 1726855348.52695: variable 'ansible_distribution' from source: facts 30582 1726855348.52698: variable '__network_rh_distros' from source: role '' defaults 30582 1726855348.52700: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.52709: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855348.52891: variable 'ansible_distribution' from source: facts 30582 1726855348.52912: variable '__network_rh_distros' from source: role '' defaults 30582 1726855348.53092: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.53095: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855348.53097: variable 'ansible_distribution' from source: facts 30582 1726855348.53099: variable '__network_rh_distros' from source: role '' defaults 30582 1726855348.53100: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.53141: variable 'network_provider' from source: set_fact 30582 1726855348.53159: variable 'ansible_facts' from source: unknown 30582 1726855348.54012: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855348.54021: when evaluation is False, skipping this task 30582 1726855348.54029: _execute() done 30582 1726855348.54037: dumping result to json 30582 1726855348.54044: done dumping result, returning 30582 1726855348.54092: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-0000000019c7] 30582 1726855348.54104: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c7 30582 1726855348.54328: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c7 30582 1726855348.54332: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855348.54427: no more pending results, returning what we have 30582 1726855348.54433: results queue empty 30582 1726855348.54434: checking for any_errors_fatal 30582 1726855348.54444: done checking for any_errors_fatal 30582 1726855348.54445: checking for max_fail_percentage 30582 1726855348.54447: done checking for max_fail_percentage 30582 1726855348.54448: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.54449: done checking to see if all hosts have failed 30582 1726855348.54450: getting the remaining hosts for this loop 30582 1726855348.54451: done getting the remaining hosts for this loop 30582 1726855348.54456: getting the next task for host managed_node3 30582 1726855348.54464: done getting next task for host managed_node3 30582 1726855348.54469: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855348.54474: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855348.54536: getting variables 30582 1726855348.54539: in VariableManager get_vars() 30582 1726855348.54699: Calling all_inventory to load vars for managed_node3 30582 1726855348.54703: Calling groups_inventory to load vars for managed_node3 30582 1726855348.54706: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.54717: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.54721: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.54724: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.56379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.59057: done with get_vars() 30582 1726855348.59217: done getting variables 30582 1726855348.59292: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:28 -0400 (0:00:00.168) 0:01:24.943 ****** 30582 1726855348.59330: entering _queue_task() for managed_node3/package 30582 1726855348.60297: worker is 1 (out of 1 available) 30582 1726855348.60310: exiting _queue_task() for managed_node3/package 30582 1726855348.60321: done queuing things up, now waiting for results queue to drain 30582 
1726855348.60322: waiting for pending results... 30582 1726855348.60717: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855348.60736: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c8 30582 1726855348.60807: variable 'ansible_search_path' from source: unknown 30582 1726855348.60820: variable 'ansible_search_path' from source: unknown 30582 1726855348.60861: calling self._execute() 30582 1726855348.60979: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.60995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.61010: variable 'omit' from source: magic vars 30582 1726855348.61733: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.61737: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.61972: variable 'network_state' from source: role '' defaults 30582 1726855348.62025: Evaluated conditional (network_state != {}): False 30582 1726855348.62169: when evaluation is False, skipping this task 30582 1726855348.62173: _execute() done 30582 1726855348.62176: dumping result to json 30582 1726855348.62178: done dumping result, returning 30582 1726855348.62181: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000019c8] 30582 1726855348.62184: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c8 30582 1726855348.62465: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c8 30582 1726855348.62469: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855348.62527: no more pending results, returning what we have 30582 1726855348.62532: 
results queue empty 30582 1726855348.62533: checking for any_errors_fatal 30582 1726855348.62540: done checking for any_errors_fatal 30582 1726855348.62541: checking for max_fail_percentage 30582 1726855348.62548: done checking for max_fail_percentage 30582 1726855348.62549: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.62550: done checking to see if all hosts have failed 30582 1726855348.62551: getting the remaining hosts for this loop 30582 1726855348.62553: done getting the remaining hosts for this loop 30582 1726855348.62557: getting the next task for host managed_node3 30582 1726855348.62566: done getting next task for host managed_node3 30582 1726855348.62571: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855348.62657: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855348.62773: getting variables 30582 1726855348.62775: in VariableManager get_vars() 30582 1726855348.63010: Calling all_inventory to load vars for managed_node3 30582 1726855348.63013: Calling groups_inventory to load vars for managed_node3 30582 1726855348.63016: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.63028: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.63032: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.63035: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.64794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.68118: done with get_vars() 30582 1726855348.68159: done getting variables 30582 1726855348.68335: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:28 -0400 (0:00:00.090) 0:01:25.033 ****** 30582 1726855348.68375: entering _queue_task() for managed_node3/package 30582 1726855348.69296: worker is 1 (out of 1 available) 30582 1726855348.69309: exiting _queue_task() for managed_node3/package 30582 1726855348.69321: done queuing things up, now waiting for results queue to drain 30582 1726855348.69322: waiting for pending results... 
30582 1726855348.69756: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855348.69847: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019c9 30582 1726855348.69935: variable 'ansible_search_path' from source: unknown 30582 1726855348.69939: variable 'ansible_search_path' from source: unknown 30582 1726855348.70136: calling self._execute() 30582 1726855348.70255: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.70267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.70293: variable 'omit' from source: magic vars 30582 1726855348.70704: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.70732: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.70852: variable 'network_state' from source: role '' defaults 30582 1726855348.70867: Evaluated conditional (network_state != {}): False 30582 1726855348.70892: when evaluation is False, skipping this task 30582 1726855348.70895: _execute() done 30582 1726855348.70898: dumping result to json 30582 1726855348.70900: done dumping result, returning 30582 1726855348.70905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000019c9] 30582 1726855348.70936: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c9 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855348.71245: no more pending results, returning what we have 30582 1726855348.71250: results queue empty 30582 1726855348.71251: checking for any_errors_fatal 30582 1726855348.71286: done checking for any_errors_fatal 30582 1726855348.71290: checking for max_fail_percentage 30582 
1726855348.71293: done checking for max_fail_percentage 30582 1726855348.71294: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.71295: done checking to see if all hosts have failed 30582 1726855348.71296: getting the remaining hosts for this loop 30582 1726855348.71297: done getting the remaining hosts for this loop 30582 1726855348.71302: getting the next task for host managed_node3 30582 1726855348.71312: done getting next task for host managed_node3 30582 1726855348.71317: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855348.71322: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855348.71610: getting variables 30582 1726855348.71613: in VariableManager get_vars() 30582 1726855348.71653: Calling all_inventory to load vars for managed_node3 30582 1726855348.71656: Calling groups_inventory to load vars for managed_node3 30582 1726855348.71659: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.71668: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.71672: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.71675: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.72206: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019c9 30582 1726855348.72209: WORKER PROCESS EXITING 30582 1726855348.73191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.75056: done with get_vars() 30582 1726855348.75121: done getting variables 30582 1726855348.75195: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:28 -0400 (0:00:00.068) 0:01:25.102 ****** 30582 1726855348.75241: entering _queue_task() for managed_node3/service 30582 1726855348.75663: worker is 1 (out of 1 available) 30582 1726855348.75900: exiting _queue_task() for managed_node3/service 30582 1726855348.75912: done queuing things up, now waiting for results queue to drain 30582 1726855348.75914: waiting for pending results... 
30582 1726855348.76028: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855348.76566: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019ca 30582 1726855348.76591: variable 'ansible_search_path' from source: unknown 30582 1726855348.76655: variable 'ansible_search_path' from source: unknown 30582 1726855348.76703: calling self._execute() 30582 1726855348.76976: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.76994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.77009: variable 'omit' from source: magic vars 30582 1726855348.77794: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.77908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.78138: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855348.78379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855348.81971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855348.82124: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855348.82248: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855348.82294: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855348.82385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855348.82597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855348.82633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.82718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.82897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.82901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.82950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.83033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.83068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.83439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.83444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.83447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855348.83477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855348.83538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.83605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855348.83649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855348.83885: variable 'network_connections' from source: include params 30582 1726855348.83910: variable 'interface' from source: play vars 30582 1726855348.83985: variable 'interface' from source: play vars 30582 1726855348.84068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855348.84255: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855348.84297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855348.84334: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855348.84365: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855348.84410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855348.84442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855348.84472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855348.84546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855348.84566: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855348.84820: variable 'network_connections' from source: include params 30582 1726855348.84829: variable 'interface' from source: play vars 30582 1726855348.84889: variable 'interface' from source: play vars 30582 1726855348.84916: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855348.84923: when evaluation is False, skipping this task 30582 1726855348.84929: _execute() done 30582 1726855348.84981: dumping result to json 30582 1726855348.84984: done dumping result, returning 30582 1726855348.84986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000019ca] 30582 1726855348.84990: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019ca 30582 1726855348.85068: done sending task result for task 
0affcc66-ac2b-aa83-7d57-0000000019ca skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855348.85206: no more pending results, returning what we have 30582 1726855348.85210: results queue empty 30582 1726855348.85211: checking for any_errors_fatal 30582 1726855348.85218: done checking for any_errors_fatal 30582 1726855348.85219: checking for max_fail_percentage 30582 1726855348.85221: done checking for max_fail_percentage 30582 1726855348.85222: checking to see if all hosts have failed and the running result is not ok 30582 1726855348.85223: done checking to see if all hosts have failed 30582 1726855348.85224: getting the remaining hosts for this loop 30582 1726855348.85225: done getting the remaining hosts for this loop 30582 1726855348.85229: getting the next task for host managed_node3 30582 1726855348.85237: done getting next task for host managed_node3 30582 1726855348.85242: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855348.85246: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855348.85261: WORKER PROCESS EXITING 30582 1726855348.85401: getting variables 30582 1726855348.85403: in VariableManager get_vars() 30582 1726855348.85440: Calling all_inventory to load vars for managed_node3 30582 1726855348.85443: Calling groups_inventory to load vars for managed_node3 30582 1726855348.85445: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855348.85454: Calling all_plugins_play to load vars for managed_node3 30582 1726855348.85456: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855348.85458: Calling groups_plugins_play to load vars for managed_node3 30582 1726855348.88462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855348.90457: done with get_vars() 30582 1726855348.90492: done getting variables 30582 1726855348.90548: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:28 -0400 (0:00:00.153) 0:01:25.255 ****** 30582 1726855348.90583: entering _queue_task() for managed_node3/service 30582 1726855348.91573: worker is 1 (out of 1 available) 30582 1726855348.91586: exiting _queue_task() for managed_node3/service 30582 1726855348.91602: done 
queuing things up, now waiting for results queue to drain 30582 1726855348.91604: waiting for pending results... 30582 1726855348.92310: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855348.92499: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019cb 30582 1726855348.92694: variable 'ansible_search_path' from source: unknown 30582 1726855348.92697: variable 'ansible_search_path' from source: unknown 30582 1726855348.92700: calling self._execute() 30582 1726855348.93084: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855348.93090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855348.93092: variable 'omit' from source: magic vars 30582 1726855348.94260: variable 'ansible_distribution_major_version' from source: facts 30582 1726855348.94403: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855348.94893: variable 'network_provider' from source: set_fact 30582 1726855348.94897: variable 'network_state' from source: role '' defaults 30582 1726855348.94900: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855348.94903: variable 'omit' from source: magic vars 30582 1726855348.94906: variable 'omit' from source: magic vars 30582 1726855348.95022: variable 'network_service_name' from source: role '' defaults 30582 1726855348.95215: variable 'network_service_name' from source: role '' defaults 30582 1726855348.95445: variable '__network_provider_setup' from source: role '' defaults 30582 1726855348.95458: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855348.95721: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855348.95724: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855348.95727: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855348.96140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855349.00775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855349.00906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855349.01093: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855349.01124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855349.01157: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855349.01362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855349.01407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855349.01604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.01607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855349.01610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855349.01723: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855349.01842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855349.01927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.01965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855349.02146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855349.02527: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855349.02847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855349.02878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855349.02959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.03048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855349.03136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855349.03329: variable 'ansible_python' from source: facts 30582 1726855349.03486: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855349.03706: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855349.03772: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855349.04091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855349.04263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855349.04381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.04384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855349.04386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855349.04467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855349.04643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855349.04825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.04900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855349.05495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855349.05566: variable 'network_connections' from source: include params 30582 1726855349.05835: variable 'interface' from source: play vars 30582 1726855349.06252: variable 'interface' from source: play vars 30582 1726855349.06490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855349.06917: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855349.06974: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855349.07144: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855349.07192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855349.07419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855349.07460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855349.07663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855349.07666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855349.07792: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855349.08396: variable 'network_connections' from source: include params 30582 1726855349.08406: variable 'interface' from source: play vars 30582 1726855349.08694: variable 'interface' from source: play vars 30582 1726855349.08697: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855349.08824: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855349.09466: variable 'network_connections' from source: include params 30582 1726855349.09478: variable 'interface' from source: play vars 30582 1726855349.09645: variable 'interface' from source: play vars 30582 1726855349.09675: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855349.09810: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855349.10467: variable 'network_connections' from source: include params 30582 1726855349.10502: variable 'interface' from source: play vars 30582 1726855349.10670: variable 'interface' from source: play vars 30582 1726855349.10829: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855349.10929: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855349.10943: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855349.11145: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855349.11495: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855349.12647: variable 'network_connections' from source: include params 30582 1726855349.12661: variable 'interface' from source: play vars 30582 1726855349.12811: variable 'interface' from source: play vars 30582 1726855349.12838: variable 'ansible_distribution' from source: facts 30582 1726855349.12876: variable '__network_rh_distros' from source: role '' defaults 30582 1726855349.12886: variable 'ansible_distribution_major_version' from source: facts 30582 1726855349.12908: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855349.13318: variable 'ansible_distribution' from source: facts 30582 1726855349.13329: variable '__network_rh_distros' from source: role '' defaults 30582 1726855349.13338: variable 'ansible_distribution_major_version' from source: facts 30582 1726855349.13361: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855349.13896: variable 'ansible_distribution' from source: facts 30582 1726855349.13900: variable '__network_rh_distros' from source: role '' defaults 30582 1726855349.13904: variable 'ansible_distribution_major_version' from source: facts 30582 1726855349.13906: variable 'network_provider' from source: set_fact 30582 1726855349.13909: variable 'omit' from source: magic vars 30582 1726855349.13912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855349.14003: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855349.14007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855349.14009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855349.14012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855349.14117: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855349.14125: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855349.14132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855349.14293: Set connection var ansible_timeout to 10 30582 1726855349.14444: Set connection var ansible_connection to ssh 30582 1726855349.14457: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855349.14475: Set connection var ansible_pipelining to False 30582 1726855349.14484: Set connection var ansible_shell_executable to /bin/sh 30582 1726855349.14494: Set connection var ansible_shell_type to sh 30582 1726855349.14520: variable 'ansible_shell_executable' from source: unknown 30582 1726855349.14526: variable 'ansible_connection' from source: unknown 30582 1726855349.14532: variable 'ansible_module_compression' from source: unknown 30582 1726855349.14536: variable 'ansible_shell_type' from source: unknown 30582 1726855349.14546: variable 'ansible_shell_executable' from source: unknown 30582 1726855349.14551: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855349.14557: variable 'ansible_pipelining' from source: unknown 30582 1726855349.14562: variable 'ansible_timeout' from source: unknown 30582 1726855349.14577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855349.14821: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855349.14862: variable 'omit' from source: magic vars 30582 1726855349.14900: starting attempt loop 30582 1726855349.14909: running the handler 30582 1726855349.15045: variable 'ansible_facts' from source: unknown 30582 1726855349.31923: _low_level_execute_command(): starting 30582 1726855349.31940: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855349.34014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855349.34233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855349.34397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30582 1726855349.36005: stdout chunk (state=3): >>>/root <<< 30582 1726855349.36172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855349.36184: stdout chunk (state=3): >>><<< 30582 1726855349.36611: stderr chunk (state=3): >>><<< 30582 1726855349.36615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855349.36618: _low_level_execute_command(): starting 30582 1726855349.36621: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267 `" && echo ansible-tmp-1726855349.3652308-34563-153938446507267="` echo /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267 `" ) && sleep 0' 30582 1726855349.38337: stderr 
chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855349.38391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855349.38668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855349.38681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855349.38729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855349.38905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855349.40854: stdout chunk (state=3): >>>ansible-tmp-1726855349.3652308-34563-153938446507267=/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267 <<< 30582 1726855349.40952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855349.40986: stderr chunk (state=3): >>><<< 30582 1726855349.41000: stdout chunk (state=3): >>><<< 30582 1726855349.41395: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855349.3652308-34563-153938446507267=/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855349.41399: variable 'ansible_module_compression' from source: unknown 30582 1726855349.41401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855349.41403: variable 'ansible_facts' from source: unknown 30582 1726855349.41893: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py 30582 1726855349.42094: Sending initial data 30582 1726855349.42103: Sent initial data (156 bytes) 30582 1726855349.43898: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855349.43915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855349.44137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855349.44207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855349.44219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855349.44404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855349.44516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855349.46218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855349.46223: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855349.46281: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpx45gc6xe /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py <<< 30582 1726855349.46296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py" <<< 30582 1726855349.46374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpx45gc6xe" to remote "/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py" <<< 30582 1726855349.48793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855349.48849: stderr chunk (state=3): >>><<< 30582 1726855349.48856: stdout chunk (state=3): >>><<< 30582 1726855349.48903: done transferring module to remote 30582 1726855349.49068: _low_level_execute_command(): starting 30582 1726855349.49071: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/ /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py && sleep 0' 30582 1726855349.50394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855349.50685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855349.50692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855349.50695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855349.50965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855349.51019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855349.53134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855349.53145: stdout chunk (state=3): >>><<< 30582 1726855349.53157: stderr chunk (state=3): >>><<< 30582 1726855349.53177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855349.53186: _low_level_execute_command(): starting 30582 1726855349.53198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/AnsiballZ_systemd.py && sleep 0' 30582 1726855349.54392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855349.54508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
<<< 30582 1726855349.54734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855349.54797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855349.84105: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322609664", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2215712000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855349.84147: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", 
"SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", 
"KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": 
"Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855349.86083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855349.86089: stdout chunk (state=3): >>><<< 30582 1726855349.86092: stderr chunk (state=3): >>><<< 30582 1726855349.86117: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3322609664", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2215712000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": 
"[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed.
30582 1726855349.86537: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30582 1726855349.86551: _low_level_execute_command(): starting
30582 1726855349.86560: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855349.3652308-34563-153938446507267/ > /dev/null 2>&1 && sleep 0'
30582 1726855349.87872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30582 1726855349.87914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30582 1726855349.87931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855349.87950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30582 1726855349.87969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<<
30582 1726855349.88063: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<<
30582 1726855349.88175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30582 1726855349.88302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30582 1726855349.90233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30582 1726855349.90282: stdout chunk (state=3): >>><<<
30582 1726855349.90311: stderr chunk (state=3): >>><<<
30582 1726855349.90797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30582 1726855349.90801: handler run complete
30582 1726855349.90804: attempt loop complete, returning result
30582 1726855349.90806: _execute() done
30582 1726855349.90808: dumping result to json
30582 1726855349.90810: done dumping result, returning
30582 1726855349.90812: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-0000000019cb]
30582 1726855349.90814: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cb
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30582 1726855349.91160: no more pending results, returning what we have
30582 1726855349.91164: results queue empty
30582 1726855349.91168: checking for any_errors_fatal
30582 1726855349.91175: done checking for any_errors_fatal
30582 1726855349.91175: checking for max_fail_percentage
30582 1726855349.91178: done checking for max_fail_percentage
30582 1726855349.91179: checking to see if all hosts have failed and the running result is not ok
30582 1726855349.91180: done checking to see if all hosts have failed
30582 1726855349.91180: getting the remaining hosts for this loop
30582 1726855349.91182: done getting the remaining hosts for this loop
30582 1726855349.91185: getting the next task for host managed_node3
30582 1726855349.91196: done getting next task for host managed_node3
30582 1726855349.91200: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
30582 1726855349.91205: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855349.91220: getting variables
30582 1726855349.91222: in VariableManager get_vars()
30582 1726855349.91504: Calling all_inventory to load vars for managed_node3
30582 1726855349.91513: Calling groups_inventory to load vars for managed_node3
30582 1726855349.91516: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855349.91526: Calling all_plugins_play to load vars for managed_node3
30582 1726855349.91530: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855349.91538: Calling groups_plugins_play to load vars for managed_node3
30582 1726855349.92243: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cb
30582 1726855349.92247: WORKER PROCESS EXITING
30582 1726855349.94866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.01510: done with get_vars()
30582 1726855350.01531: done getting variables
30582 1726855350.01571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 14:02:30 -0400 (0:00:01.110) 0:01:26.365 ******
30582 1726855350.01602: entering _queue_task() for managed_node3/service
30582 1726855350.01880: worker is 1 (out of 1 available)
30582 1726855350.01897: exiting _queue_task() for managed_node3/service
30582 1726855350.01909: done queuing things up, now waiting for results queue to drain
30582 1726855350.01911: waiting for pending results...
30582 1726855350.02099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
30582 1726855350.02194: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019cc
30582 1726855350.02206: variable 'ansible_search_path' from source: unknown
30582 1726855350.02210: variable 'ansible_search_path' from source: unknown
30582 1726855350.02237: calling self._execute()
30582 1726855350.02319: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.02323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.02331: variable 'omit' from source: magic vars
30582 1726855350.02621: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.02631: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.02719: variable 'network_provider' from source: set_fact
30582 1726855350.02725: Evaluated conditional (network_provider == "nm"): True
30582 1726855350.02794: variable '__network_wpa_supplicant_required' from source: role '' defaults
30582 1726855350.02855: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30582 1726855350.02974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855350.04468: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855350.04521: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855350.04549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855350.04575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855350.04598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855350.04672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855350.04692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855350.04710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855350.04737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855350.04747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855350.04783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855350.04802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855350.04819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855350.04842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855350.04853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855350.04884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855350.04902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855350.04917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855350.04941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855350.04951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855350.05057: variable 'network_connections' from source: include params
30582 1726855350.05070: variable 'interface' from source: play vars
30582 1726855350.05122: variable 'interface' from source: play vars
30582 1726855350.05172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855350.05283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855350.05323: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855350.05345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855350.05369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855350.05400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855350.05419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855350.05435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855350.05452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855350.05495: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855350.05659: variable 'network_connections' from source: include params
30582 1726855350.05663: variable 'interface' from source: play vars
30582 1726855350.05711: variable 'interface' from source: play vars
30582 1726855350.05737: Evaluated conditional (__network_wpa_supplicant_required): False
30582 1726855350.05741: when evaluation is False, skipping this task
30582 1726855350.05744: _execute() done
30582 1726855350.05746: dumping result to json
30582 1726855350.05748: done dumping result, returning
30582 1726855350.05751: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-0000000019cc]
30582 1726855350.05770: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cc
30582 1726855350.05953: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cc
30582 1726855350.05956: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
30582 1726855350.06010: no more pending results, returning what we have
30582 1726855350.06014: results queue empty
30582 1726855350.06015: checking for any_errors_fatal
30582 1726855350.06029: done checking for any_errors_fatal
30582 1726855350.06030: checking for max_fail_percentage
30582 1726855350.06032: done checking for max_fail_percentage
30582 1726855350.06033: checking to see if all hosts have failed and the running result is not ok
30582 1726855350.06033: done checking to see if all hosts have failed
30582 1726855350.06034: getting the remaining hosts for this loop
30582 1726855350.06035: done getting the remaining hosts for this loop
30582 1726855350.06038: getting the next task for host managed_node3
30582 1726855350.06045: done getting next task for host managed_node3
30582 1726855350.06049: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
30582 1726855350.06053: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855350.06078: getting variables
30582 1726855350.06079: in VariableManager get_vars()
30582 1726855350.06117: Calling all_inventory to load vars for managed_node3
30582 1726855350.06120: Calling groups_inventory to load vars for managed_node3
30582 1726855350.06121: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.06130: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.06132: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.06135: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.07398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.08575: done with get_vars()
30582 1726855350.08600: done getting variables
30582 1726855350.08644: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 14:02:30 -0400 (0:00:00.070) 0:01:26.436 ******
30582 1726855350.08674: entering _queue_task() for managed_node3/service
30582 1726855350.08943: worker is 1 (out of 1 available)
30582 1726855350.08957: exiting _queue_task() for managed_node3/service
30582 1726855350.08972: done queuing things up, now waiting for results queue to drain
30582 1726855350.08974: waiting for pending results...
30582 1726855350.09159: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
30582 1726855350.09258: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019cd
30582 1726855350.09270: variable 'ansible_search_path' from source: unknown
30582 1726855350.09274: variable 'ansible_search_path' from source: unknown
30582 1726855350.09306: calling self._execute()
30582 1726855350.09390: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.09395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.09401: variable 'omit' from source: magic vars
30582 1726855350.09684: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.09696: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.09781: variable 'network_provider' from source: set_fact
30582 1726855350.09785: Evaluated conditional (network_provider == "initscripts"): False
30582 1726855350.09789: when evaluation is False, skipping this task
30582 1726855350.09793: _execute() done
30582 1726855350.09795: dumping result to json
30582 1726855350.09798: done dumping result, returning
30582 1726855350.09807: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-0000000019cd]
30582 1726855350.09811: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cd
30582 1726855350.09905: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cd
30582 1726855350.09907: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30582 1726855350.09947: no more pending results, returning what we have
30582 1726855350.09951: results queue empty
30582 1726855350.09952: checking for any_errors_fatal
30582 1726855350.09961: done checking for any_errors_fatal
30582 1726855350.09962: checking for max_fail_percentage
30582 1726855350.09964: done checking for max_fail_percentage
30582 1726855350.09967: checking to see if all hosts have failed and the running result is not ok
30582 1726855350.09968: done checking to see if all hosts have failed
30582 1726855350.09968: getting the remaining hosts for this loop
30582 1726855350.09970: done getting the remaining hosts for this loop
30582 1726855350.09973: getting the next task for host managed_node3
30582 1726855350.09983: done getting next task for host managed_node3
30582 1726855350.09988: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
30582 1726855350.09995: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855350.10022: getting variables
30582 1726855350.10024: in VariableManager get_vars()
30582 1726855350.10063: Calling all_inventory to load vars for managed_node3
30582 1726855350.10068: Calling groups_inventory to load vars for managed_node3
30582 1726855350.10070: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.10081: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.10083: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.10085: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.11038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.11910: done with get_vars()
30582 1726855350.11930: done getting variables
30582 1726855350.11978: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 14:02:30 -0400 (0:00:00.033) 0:01:26.469 ******
30582 1726855350.12007: entering _queue_task() for managed_node3/copy
30582 1726855350.12279: worker is 1 (out of 1 available)
30582 1726855350.12296: exiting _queue_task() for managed_node3/copy
30582 1726855350.12309: done queuing things up, now waiting for results queue to drain
30582 1726855350.12311: waiting for pending results...
30582 1726855350.12502: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
30582 1726855350.12626: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019ce
30582 1726855350.12637: variable 'ansible_search_path' from source: unknown
30582 1726855350.12642: variable 'ansible_search_path' from source: unknown
30582 1726855350.12673: calling self._execute()
30582 1726855350.12749: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.12754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.12763: variable 'omit' from source: magic vars
30582 1726855350.13046: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.13057: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.13143: variable 'network_provider' from source: set_fact
30582 1726855350.13146: Evaluated conditional (network_provider == "initscripts"): False
30582 1726855350.13149: when evaluation is False, skipping this task
30582 1726855350.13152: _execute() done
30582 1726855350.13155: dumping result to json
30582 1726855350.13157: done dumping result, returning
30582 1726855350.13169: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-0000000019ce]
30582 1726855350.13172: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019ce
30582 1726855350.13269: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019ce
30582 1726855350.13272: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
30582 1726855350.13317: no more pending results, returning what we have
30582 1726855350.13322: results queue empty
30582 1726855350.13323: checking for any_errors_fatal
30582 1726855350.13331: done checking for any_errors_fatal
30582 1726855350.13332: checking for max_fail_percentage
30582 1726855350.13334: done checking for max_fail_percentage
30582 1726855350.13335: checking to see if all hosts have failed and the running result is not ok
30582 1726855350.13335: done checking to see if all hosts have failed
30582 1726855350.13336: getting the remaining hosts for this loop
30582 1726855350.13337: done getting the remaining hosts for this loop
30582 1726855350.13341: getting the next task for host managed_node3
30582 1726855350.13350: done getting next task for host managed_node3
30582 1726855350.13353: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
30582 1726855350.13359: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855350.13389: getting variables
30582 1726855350.13391: in VariableManager get_vars()
30582 1726855350.13430: Calling all_inventory to load vars for managed_node3
30582 1726855350.13433: Calling groups_inventory to load vars for managed_node3
30582 1726855350.13435: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.13445: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.13448: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.13450: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.14273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.15153: done with get_vars()
30582 1726855350.15174: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 14:02:30 -0400 (0:00:00.032) 0:01:26.502 ******
30582 1726855350.15240: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
30582 1726855350.15507: worker is 1 (out of 1 available)
30582 1726855350.15523: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
30582 1726855350.15536: done queuing things up, now waiting for results queue to drain
30582 1726855350.15537: waiting for pending results...
30582 1726855350.15726: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
30582 1726855350.15833: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019cf
30582 1726855350.15843: variable 'ansible_search_path' from source: unknown
30582 1726855350.15846: variable 'ansible_search_path' from source: unknown
30582 1726855350.15880: calling self._execute()
30582 1726855350.15957: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.15961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.15970: variable 'omit' from source: magic vars
30582 1726855350.16254: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.16263: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.16269: variable 'omit' from source: magic vars
30582 1726855350.16314: variable 'omit' from source: magic vars
30582 1726855350.16428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855350.18193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855350.18239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855350.18278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855350.18306: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855350.18326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855350.18392: variable 'network_provider' from source: set_fact
30582 1726855350.18491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855350.18512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855350.18531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855350.18557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855350.18570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855350.18629: variable 'omit' from source: magic vars
30582 1726855350.18707: variable 'omit' from source: magic vars
30582 1726855350.18776: variable 'network_connections' from source: include params
30582 1726855350.18786: variable 'interface' from source: play vars
30582 1726855350.18833: variable 'interface' from source: play vars
30582 1726855350.18937: variable 'omit' from source: magic vars
30582 1726855350.18945: variable '__lsr_ansible_managed' from source: task vars
30582 1726855350.18986: variable '__lsr_ansible_managed' from source: task vars
30582 1726855350.19113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
30582 1726855350.19260: Loaded config def from plugin (lookup/template)
30582 1726855350.19263: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
30582 1726855350.19286: File lookup term: get_ansible_managed.j2
30582 1726855350.19291: variable 'ansible_search_path' from source: unknown
30582 1726855350.19294: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
30582 1726855350.19306: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
30582 1726855350.19319: variable 'ansible_search_path' from source: unknown
30582 1726855350.22725: variable 'ansible_managed' from source: unknown
30582 1726855350.22818: variable 'omit' from source: magic vars
30582 1726855350.22840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855350.22860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855350.22875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855350.22889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py
(found_in_cache=True, class_only=False) 30582 1726855350.22897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.22919: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855350.22922: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.22925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.22990: Set connection var ansible_timeout to 10 30582 1726855350.22993: Set connection var ansible_connection to ssh 30582 1726855350.22999: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855350.23004: Set connection var ansible_pipelining to False 30582 1726855350.23008: Set connection var ansible_shell_executable to /bin/sh 30582 1726855350.23011: Set connection var ansible_shell_type to sh 30582 1726855350.23030: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.23033: variable 'ansible_connection' from source: unknown 30582 1726855350.23036: variable 'ansible_module_compression' from source: unknown 30582 1726855350.23038: variable 'ansible_shell_type' from source: unknown 30582 1726855350.23040: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.23042: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.23044: variable 'ansible_pipelining' from source: unknown 30582 1726855350.23048: variable 'ansible_timeout' from source: unknown 30582 1726855350.23052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.23145: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855350.23156: variable 'omit' from 
source: magic vars 30582 1726855350.23159: starting attempt loop 30582 1726855350.23162: running the handler 30582 1726855350.23175: _low_level_execute_command(): starting 30582 1726855350.23180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855350.23658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.23676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855350.23680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.23697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.23745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.23749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.23759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.23833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.25539: stdout chunk (state=3): >>>/root <<< 30582 1726855350.25637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855350.25668: stderr chunk (state=3): >>><<< 30582 1726855350.25672: stdout chunk (state=3): >>><<< 30582 1726855350.25692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855350.25703: _low_level_execute_command(): starting 30582 1726855350.25710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710 `" && echo ansible-tmp-1726855350.2569323-34644-229986470922710="` echo /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710 `" ) && sleep 0' 30582 1726855350.26145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855350.26184: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855350.26189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855350.26192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.26194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.26196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855350.26198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.26244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.26247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.26249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.26313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.28218: stdout chunk (state=3): >>>ansible-tmp-1726855350.2569323-34644-229986470922710=/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710 <<< 30582 1726855350.28318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855350.28348: stderr chunk (state=3): >>><<< 30582 1726855350.28351: stdout chunk (state=3): >>><<< 30582 1726855350.28368: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855350.2569323-34644-229986470922710=/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855350.28412: variable 'ansible_module_compression' from source: unknown 30582 1726855350.28451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855350.28495: variable 'ansible_facts' from source: unknown 30582 1726855350.28585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py 30582 1726855350.28688: Sending initial data 30582 1726855350.28694: Sent initial data (168 bytes) 30582 1726855350.29150: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855350.29158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855350.29161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.29163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.29167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.29221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.29224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.29226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.29285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.30863: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855350.30916: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855350.30977: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxah55h0j /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py <<< 30582 1726855350.30980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py" <<< 30582 1726855350.31044: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxah55h0j" to remote "/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py" <<< 30582 1726855350.31048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py" <<< 30582 1726855350.31873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855350.31877: stderr chunk (state=3): >>><<< 30582 1726855350.31882: stdout chunk (state=3): >>><<< 30582 1726855350.31912: done transferring module to remote 30582 1726855350.31922: _low_level_execute_command(): starting 30582 1726855350.31927: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/ /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py && sleep 
0' 30582 1726855350.32363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.32397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855350.32400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.32403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855350.32405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855350.32407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.32459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.32464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.32470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.32521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.34355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855350.34390: stderr chunk (state=3): >>><<< 30582 1726855350.34394: stdout chunk (state=3): >>><<< 30582 1726855350.34408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855350.34412: _low_level_execute_command(): starting 30582 1726855350.34416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/AnsiballZ_network_connections.py && sleep 0' 30582 1726855350.34853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855350.34857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855350.34884: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855350.34890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.34897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.34948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.34952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.34954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.35036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.60262: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855350.62194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855350.62199: stdout chunk (state=3): >>><<< 30582 1726855350.62201: stderr chunk (state=3): >>><<< 30582 1726855350.62204: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855350.62206: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855350.62209: _low_level_execute_command(): starting 30582 1726855350.62211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855350.2569323-34644-229986470922710/ > /dev/null 2>&1 && sleep 0' 30582 1726855350.62806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.62837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855350.62840: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 
1726855350.62842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855350.62844: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855350.62902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855350.62910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.62922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.62991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.64994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855350.64998: stdout chunk (state=3): >>><<< 30582 1726855350.65001: stderr chunk (state=3): >>><<< 30582 1726855350.65003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855350.65005: handler run complete 30582 1726855350.65007: attempt loop complete, returning result 30582 1726855350.65013: _execute() done 30582 1726855350.65015: dumping result to json 30582 1726855350.65017: done dumping result, returning 30582 1726855350.65019: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-0000000019cf] 30582 1726855350.65021: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cf ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active 30582 1726855350.65194: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019cf 30582 1726855350.65199: WORKER PROCESS EXITING 30582 1726855350.65218: no more pending results, returning what we have 30582 1726855350.65221: results queue empty 30582 1726855350.65222: checking for any_errors_fatal 30582 1726855350.65230: done checking for any_errors_fatal 30582 1726855350.65231: checking for max_fail_percentage 30582 1726855350.65234: done checking for max_fail_percentage 30582 1726855350.65235: checking to see if all hosts have failed and the running result is not ok 30582 1726855350.65235: done checking to see if all hosts have 
failed 30582 1726855350.65236: getting the remaining hosts for this loop 30582 1726855350.65238: done getting the remaining hosts for this loop 30582 1726855350.65241: getting the next task for host managed_node3 30582 1726855350.65249: done getting next task for host managed_node3 30582 1726855350.65253: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855350.65259: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855350.65276: getting variables
30582 1726855350.65278: in VariableManager get_vars()
30582 1726855350.65406: Calling all_inventory to load vars for managed_node3
30582 1726855350.65409: Calling groups_inventory to load vars for managed_node3
30582 1726855350.65411: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.65501: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.65505: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.65508: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.66980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.68055: done with get_vars()
30582 1726855350.68080: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 14:02:30 -0400 (0:00:00.529) 0:01:27.031 ******
30582 1726855350.68149: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30582 1726855350.68419: worker is 1 (out of 1 available)
30582 1726855350.68435: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
30582 1726855350.68448: done queuing things up, now waiting for results queue to drain
30582 1726855350.68450: waiting for pending results...
30582 1726855350.68647: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state
30582 1726855350.68755: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019d0
30582 1726855350.68766: variable 'ansible_search_path' from source: unknown
30582 1726855350.68770: variable 'ansible_search_path' from source: unknown
30582 1726855350.68806: calling self._execute()
30582 1726855350.68882: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.68891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.68905: variable 'omit' from source: magic vars
30582 1726855350.69326: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.69330: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.69411: variable 'network_state' from source: role '' defaults
30582 1726855350.69434: Evaluated conditional (network_state != {}): False
30582 1726855350.69437: when evaluation is False, skipping this task
30582 1726855350.69440: _execute() done
30582 1726855350.69444: dumping result to json
30582 1726855350.69446: done dumping result, returning
30582 1726855350.69448: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-0000000019d0]
30582 1726855350.69451: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d0
30582 1726855350.69546: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d0
30582 1726855350.69548: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30582 1726855350.69598: no more pending results, returning what we have
30582 1726855350.69603: results queue empty
30582 1726855350.69604: checking for any_errors_fatal
30582 1726855350.69615: done checking for any_errors_fatal
30582 1726855350.69615: checking for max_fail_percentage 30582 1726855350.69617: done checking for max_fail_percentage 30582 1726855350.69618: checking to see if all hosts have failed and the running result is not ok 30582 1726855350.69619: done checking to see if all hosts have failed 30582 1726855350.69620: getting the remaining hosts for this loop 30582 1726855350.69621: done getting the remaining hosts for this loop 30582 1726855350.69625: getting the next task for host managed_node3 30582 1726855350.69633: done getting next task for host managed_node3 30582 1726855350.69637: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855350.69642: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855350.69673: getting variables
30582 1726855350.69675: in VariableManager get_vars()
30582 1726855350.69718: Calling all_inventory to load vars for managed_node3
30582 1726855350.69721: Calling groups_inventory to load vars for managed_node3
30582 1726855350.69723: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.69733: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.69736: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.69738: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.71964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.73868: done with get_vars()
30582 1726855350.73894: done getting variables
30582 1726855350.73984: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 14:02:30 -0400 (0:00:00.058) 0:01:27.090 ******
30582 1726855350.74028: entering _queue_task() for managed_node3/debug
30582 1726855350.74435: worker is 1 (out of 1 available)
30582 1726855350.74449: exiting _queue_task() for managed_node3/debug
30582 1726855350.74466: done queuing things up, now waiting for results queue to drain
30582 1726855350.74468: waiting for pending results...
30582 1726855350.75057: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855350.75102: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019d1 30582 1726855350.75123: variable 'ansible_search_path' from source: unknown 30582 1726855350.75131: variable 'ansible_search_path' from source: unknown 30582 1726855350.75173: calling self._execute() 30582 1726855350.75304: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.75327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.75342: variable 'omit' from source: magic vars 30582 1726855350.75973: variable 'ansible_distribution_major_version' from source: facts 30582 1726855350.75976: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855350.76298: variable 'omit' from source: magic vars 30582 1726855350.76302: variable 'omit' from source: magic vars 30582 1726855350.76406: variable 'omit' from source: magic vars 30582 1726855350.76410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855350.76413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855350.76428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855350.76526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.76544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.76584: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855350.76630: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.76639: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855350.76875: Set connection var ansible_timeout to 10 30582 1726855350.76956: Set connection var ansible_connection to ssh 30582 1726855350.76970: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855350.76981: Set connection var ansible_pipelining to False 30582 1726855350.76994: Set connection var ansible_shell_executable to /bin/sh 30582 1726855350.77001: Set connection var ansible_shell_type to sh 30582 1726855350.77034: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.77066: variable 'ansible_connection' from source: unknown 30582 1726855350.77075: variable 'ansible_module_compression' from source: unknown 30582 1726855350.77083: variable 'ansible_shell_type' from source: unknown 30582 1726855350.77279: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.77282: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.77284: variable 'ansible_pipelining' from source: unknown 30582 1726855350.77288: variable 'ansible_timeout' from source: unknown 30582 1726855350.77291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.77451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855350.77477: variable 'omit' from source: magic vars 30582 1726855350.77492: starting attempt loop 30582 1726855350.77503: running the handler 30582 1726855350.77661: variable '__network_connections_result' from source: set_fact 30582 1726855350.77728: handler run complete 30582 1726855350.77750: attempt loop complete, returning result 30582 1726855350.77757: _execute() done 30582 1726855350.77763: dumping result to json 30582 1726855350.77823: 
done dumping result, returning
30582 1726855350.77827: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000019d1]
30582 1726855350.77829: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d1
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active"
    ]
}
30582 1726855350.78003: no more pending results, returning what we have
30582 1726855350.78007: results queue empty
30582 1726855350.78009: checking for any_errors_fatal
30582 1726855350.78017: done checking for any_errors_fatal
30582 1726855350.78018: checking for max_fail_percentage
30582 1726855350.78020: done checking for max_fail_percentage
30582 1726855350.78021: checking to see if all hosts have failed and the running result is not ok
30582 1726855350.78022: done checking to see if all hosts have failed
30582 1726855350.78023: getting the remaining hosts for this loop
30582 1726855350.78024: done getting the remaining hosts for this loop
30582 1726855350.78028: getting the next task for host managed_node3
30582 1726855350.78048: done getting next task for host managed_node3
30582 1726855350.78052: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30582 1726855350.78059: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855350.78076: getting variables 30582 1726855350.78078: in VariableManager get_vars() 30582 1726855350.78235: Calling all_inventory to load vars for managed_node3 30582 1726855350.78238: Calling groups_inventory to load vars for managed_node3 30582 1726855350.78241: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855350.78247: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d1 30582 1726855350.78250: WORKER PROCESS EXITING 30582 1726855350.78263: Calling all_plugins_play to load vars for managed_node3 30582 1726855350.78266: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855350.78270: Calling groups_plugins_play to load vars for managed_node3 30582 1726855350.79951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855350.81645: done with get_vars() 30582 1726855350.81684: done getting variables 30582 1726855350.81751: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 14:02:30 -0400 (0:00:00.077) 0:01:27.167 ******
30582 1726855350.81808: entering _queue_task() for managed_node3/debug
30582 1726855350.82201: worker is 1 (out of 1 available)
30582 1726855350.82292: exiting _queue_task() for managed_node3/debug
30582 1726855350.82303: done queuing things up, now waiting for results queue to drain
30582 1726855350.82305: waiting for pending results...
30582 1726855350.82562: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30582 1726855350.82744: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019d2
30582 1726855350.82776: variable 'ansible_search_path' from source: unknown
30582 1726855350.82792: variable 'ansible_search_path' from source: unknown
30582 1726855350.82878: calling self._execute()
30582 1726855350.82945: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.82956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.82972: variable 'omit' from source: magic vars
30582 1726855350.83397: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.83423: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.83436: variable 'omit' from source: magic vars
30582 1726855350.83514: variable 'omit' from source: magic vars
30582 1726855350.83593: variable 'omit' from source: magic vars
30582 1726855350.83610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855350.83660: Loading
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855350.83689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855350.83714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.83731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.83859: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855350.83862: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.83867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.83915: Set connection var ansible_timeout to 10 30582 1726855350.83923: Set connection var ansible_connection to ssh 30582 1726855350.83937: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855350.83947: Set connection var ansible_pipelining to False 30582 1726855350.83962: Set connection var ansible_shell_executable to /bin/sh 30582 1726855350.83976: Set connection var ansible_shell_type to sh 30582 1726855350.84008: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.84018: variable 'ansible_connection' from source: unknown 30582 1726855350.84076: variable 'ansible_module_compression' from source: unknown 30582 1726855350.84080: variable 'ansible_shell_type' from source: unknown 30582 1726855350.84083: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.84085: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.84089: variable 'ansible_pipelining' from source: unknown 30582 1726855350.84091: variable 'ansible_timeout' from source: unknown 30582 1726855350.84094: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3'
30582 1726855350.84296: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855350.84300: variable 'omit' from source: magic vars
30582 1726855350.84302: starting attempt loop
30582 1726855350.84304: running the handler
30582 1726855350.84330: variable '__network_connections_result' from source: set_fact
30582 1726855350.84435: variable '__network_connections_result' from source: set_fact
30582 1726855350.84559: handler run complete
30582 1726855350.84594: attempt loop complete, returning result
30582 1726855350.84602: _execute() done
30582 1726855350.84609: dumping result to json
30582 1726855350.84624: done dumping result, returning
30582 1726855350.84693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000019d2]
30582 1726855350.84696: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d2
30582 1726855350.84873: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d2
30582 1726855350.84877: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "state": "up"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": false,
        "failed": false,
        "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active\n",
        "stderr_lines": [
            "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 07988b43-0bc6-4bfd-8ab8-3bff1d23cced skipped because already active"
        ]
    }
}
30582 1726855350.84979: no more pending results, returning what we have
30582 1726855350.84984: results queue empty
30582 1726855350.84985: checking for any_errors_fatal
30582 1726855350.85099: done checking for any_errors_fatal
30582 1726855350.85101: checking for max_fail_percentage
30582 1726855350.85103: done checking for max_fail_percentage
30582 1726855350.85104: checking to see if all hosts have failed and the running result is not ok
30582 1726855350.85105: done checking to see if all hosts have failed
30582 1726855350.85106: getting the remaining hosts for this loop
30582 1726855350.85107: done getting the remaining hosts for this loop
30582 1726855350.85111: getting the next task for host managed_node3
30582 1726855350.85119: done getting next task for host managed_node3
30582 1726855350.85123: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30582 1726855350.85128: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855350.85142: getting variables
30582 1726855350.85144: in VariableManager get_vars()
30582 1726855350.85295: Calling all_inventory to load vars for managed_node3
30582 1726855350.85300: Calling groups_inventory to load vars for managed_node3
30582 1726855350.85309: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.85319: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.85323: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.85327: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.87111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.88735: done with get_vars()
30582 1726855350.88781: done getting variables
30582 1726855350.88839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 14:02:30 -0400 (0:00:00.070) 0:01:27.238 ******
30582 1726855350.88880: entering _queue_task() for managed_node3/debug
30582 1726855350.89250: worker is 1 (out of 1 available)
30582 1726855350.89267: exiting _queue_task() for managed_node3/debug
30582 1726855350.89281: done queuing things up, now waiting for results queue to drain
30582 1726855350.89283: waiting for pending results...
30582 1726855350.89631: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30582 1726855350.89838: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019d3
30582 1726855350.89842: variable 'ansible_search_path' from source: unknown
30582 1726855350.89844: variable 'ansible_search_path' from source: unknown
30582 1726855350.89846: calling self._execute()
30582 1726855350.89908: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855350.89919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855350.89931: variable 'omit' from source: magic vars
30582 1726855350.90330: variable 'ansible_distribution_major_version' from source: facts
30582 1726855350.90346: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855350.90472: variable 'network_state' from source: role '' defaults
30582 1726855350.90491: Evaluated conditional (network_state != {}): False
30582 1726855350.90501: when evaluation is False, skipping this task
30582 1726855350.90508: _execute() done
30582 1726855350.90513: dumping result to json
30582 1726855350.90519: done dumping result, returning
30582 1726855350.90529: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-0000000019d3]
30582 1726855350.90537: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d3
30582 1726855350.90742: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d3
30582 1726855350.90745: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
30582 1726855350.90801: no more pending results, returning what we have
30582 1726855350.90805: results queue empty
30582 1726855350.90806: checking for any_errors_fatal
30582 1726855350.90817: done checking for any_errors_fatal
30582 1726855350.90818: checking for
max_fail_percentage 30582 1726855350.90820: done checking for max_fail_percentage 30582 1726855350.90821: checking to see if all hosts have failed and the running result is not ok 30582 1726855350.90822: done checking to see if all hosts have failed 30582 1726855350.90823: getting the remaining hosts for this loop 30582 1726855350.90824: done getting the remaining hosts for this loop 30582 1726855350.90828: getting the next task for host managed_node3 30582 1726855350.90836: done getting next task for host managed_node3 30582 1726855350.90839: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855350.90846: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855350.91105: getting variables
30582 1726855350.91107: in VariableManager get_vars()
30582 1726855350.91144: Calling all_inventory to load vars for managed_node3
30582 1726855350.91147: Calling groups_inventory to load vars for managed_node3
30582 1726855350.91149: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855350.91159: Calling all_plugins_play to load vars for managed_node3
30582 1726855350.91162: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855350.91167: Calling groups_plugins_play to load vars for managed_node3
30582 1726855350.92472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855350.94120: done with get_vars()
30582 1726855350.94148: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 14:02:30 -0400 (0:00:00.053) 0:01:27.292 ******
30582 1726855350.94253: entering _queue_task() for managed_node3/ping
30582 1726855350.94698: worker is 1 (out of 1 available)
30582 1726855350.94712: exiting _queue_task() for managed_node3/ping
30582 1726855350.94724: done queuing things up, now waiting for results queue to drain
30582 1726855350.94726: waiting for pending results...
30582 1726855350.95109: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855350.95137: in run() - task 0affcc66-ac2b-aa83-7d57-0000000019d4 30582 1726855350.95156: variable 'ansible_search_path' from source: unknown 30582 1726855350.95164: variable 'ansible_search_path' from source: unknown 30582 1726855350.95215: calling self._execute() 30582 1726855350.95331: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.95342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.95357: variable 'omit' from source: magic vars 30582 1726855350.95781: variable 'ansible_distribution_major_version' from source: facts 30582 1726855350.95860: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855350.95863: variable 'omit' from source: magic vars 30582 1726855350.95891: variable 'omit' from source: magic vars 30582 1726855350.95930: variable 'omit' from source: magic vars 30582 1726855350.95983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855350.96026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855350.96051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855350.96084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.96104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855350.96185: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855350.96191: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.96194: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855350.96274: Set connection var ansible_timeout to 10 30582 1726855350.96282: Set connection var ansible_connection to ssh 30582 1726855350.96305: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855350.96314: Set connection var ansible_pipelining to False 30582 1726855350.96324: Set connection var ansible_shell_executable to /bin/sh 30582 1726855350.96330: Set connection var ansible_shell_type to sh 30582 1726855350.96356: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.96404: variable 'ansible_connection' from source: unknown 30582 1726855350.96407: variable 'ansible_module_compression' from source: unknown 30582 1726855350.96409: variable 'ansible_shell_type' from source: unknown 30582 1726855350.96411: variable 'ansible_shell_executable' from source: unknown 30582 1726855350.96413: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855350.96415: variable 'ansible_pipelining' from source: unknown 30582 1726855350.96417: variable 'ansible_timeout' from source: unknown 30582 1726855350.96419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855350.96626: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855350.96643: variable 'omit' from source: magic vars 30582 1726855350.96692: starting attempt loop 30582 1726855350.96695: running the handler 30582 1726855350.96698: _low_level_execute_command(): starting 30582 1726855350.96700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855350.97454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855350.97472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 
1726855350.97497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855350.97613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855350.97641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855350.97734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855350.99429: stdout chunk (state=3): >>>/root <<< 30582 1726855350.99576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855350.99597: stderr chunk (state=3): >>><<< 30582 1726855350.99607: stdout chunk (state=3): >>><<< 30582 1726855350.99640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855350.99692: _low_level_execute_command(): starting 30582 1726855350.99697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345 `" && echo ansible-tmp-1726855350.996473-34680-225862417595345="` echo /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345 `" ) && sleep 0' 30582 1726855351.00378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855351.00397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855351.00414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855351.00433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855351.00505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855351.00573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855351.00612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855351.00631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855351.00716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855351.02895: stdout chunk (state=3): >>>ansible-tmp-1726855350.996473-34680-225862417595345=/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345 <<< 30582 1726855351.02974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855351.02977: stderr chunk (state=3): >>><<< 30582 1726855351.03054: stdout chunk (state=3): >>><<< 30582 1726855351.03058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855350.996473-34680-225862417595345=/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855351.03176: variable 'ansible_module_compression' from source: unknown 30582 1726855351.03330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855351.03492: variable 'ansible_facts' from source: unknown 30582 1726855351.03495: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py 30582 1726855351.03732: Sending initial data 30582 1726855351.03741: Sent initial data (152 bytes) 30582 1726855351.04391: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855351.04495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855351.04516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855351.04533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855351.04577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855351.04667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855351.06501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855351.06561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpy8xtuihn /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py <<< 30582 1726855351.06565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py" <<< 30582 1726855351.06650: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpy8xtuihn" to remote "/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py" <<< 30582 1726855351.08474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855351.08479: stdout chunk (state=3): >>><<< 30582 1726855351.08694: stderr chunk (state=3): >>><<< 30582 1726855351.08698: done transferring module to remote 30582 1726855351.08700: _low_level_execute_command(): starting 30582 1726855351.08702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/ /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py && sleep 0' 30582 1726855351.09945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855351.09967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855351.09976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855351.10002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855351.10101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855351.12018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855351.12059: stderr chunk (state=3): >>><<< 30582 1726855351.12063: stdout chunk (state=3): >>><<< 30582 1726855351.12130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855351.12139: _low_level_execute_command(): starting 30582 1726855351.12142: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/AnsiballZ_ping.py && sleep 0' 30582 1726855351.13483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855351.14013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855351.14154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855351.14608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855351.29463: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 
1726855351.31196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855351.31200: stdout chunk (state=3): >>><<< 30582 1726855351.31202: stderr chunk (state=3): >>><<< 30582 1726855351.31296: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855351.31302: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855351.31304: _low_level_execute_command(): starting 30582 1726855351.31307: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855350.996473-34680-225862417595345/ > /dev/null 2>&1 && sleep 0' 30582 1726855351.32334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855351.32507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855351.32524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855351.32602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855351.32728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855351.32744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855351.32829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855351.34847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855351.34859: stdout chunk (state=3): >>><<< 30582 1726855351.34872: stderr chunk (state=3): >>><<< 30582 1726855351.34897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855351.35296: handler run complete 30582 
1726855351.35300: attempt loop complete, returning result 30582 1726855351.35302: _execute() done 30582 1726855351.35304: dumping result to json 30582 1726855351.35306: done dumping result, returning 30582 1726855351.35308: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-0000000019d4] 30582 1726855351.35310: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d4 30582 1726855351.35382: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000019d4 30582 1726855351.35388: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855351.35516: no more pending results, returning what we have 30582 1726855351.35529: results queue empty 30582 1726855351.35535: checking for any_errors_fatal 30582 1726855351.35542: done checking for any_errors_fatal 30582 1726855351.35543: checking for max_fail_percentage 30582 1726855351.35545: done checking for max_fail_percentage 30582 1726855351.35546: checking to see if all hosts have failed and the running result is not ok 30582 1726855351.35546: done checking to see if all hosts have failed 30582 1726855351.35547: getting the remaining hosts for this loop 30582 1726855351.35549: done getting the remaining hosts for this loop 30582 1726855351.35553: getting the next task for host managed_node3 30582 1726855351.35564: done getting next task for host managed_node3 30582 1726855351.35568: ^ task is: TASK: meta (role_complete) 30582 1726855351.35572: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855351.35584: getting variables 30582 1726855351.35586: in VariableManager get_vars() 30582 1726855351.35629: Calling all_inventory to load vars for managed_node3 30582 1726855351.35631: Calling groups_inventory to load vars for managed_node3 30582 1726855351.35633: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.35642: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.35644: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.35646: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.39416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.43015: done with get_vars() 30582 1726855351.43305: done getting variables 30582 1726855351.43397: done queuing things up, now waiting for results queue to drain 30582 1726855351.43400: results queue empty 30582 1726855351.43401: checking for any_errors_fatal 30582 1726855351.43404: done checking for any_errors_fatal 30582 1726855351.43405: checking for max_fail_percentage 30582 1726855351.43406: done checking for max_fail_percentage 30582 1726855351.43406: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855351.43407: done checking to see if all hosts have failed 30582 1726855351.43408: getting the remaining hosts for this loop 30582 1726855351.43409: done getting the remaining hosts for this loop 30582 1726855351.43411: getting the next task for host managed_node3 30582 1726855351.43419: done getting next task for host managed_node3 30582 1726855351.43421: ^ task is: TASK: Include network role 30582 1726855351.43424: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855351.43427: getting variables 30582 1726855351.43428: in VariableManager get_vars() 30582 1726855351.43441: Calling all_inventory to load vars for managed_node3 30582 1726855351.43443: Calling groups_inventory to load vars for managed_node3 30582 1726855351.43445: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.43450: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.43454: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.43457: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.46013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.49599: done with get_vars() 30582 1726855351.49633: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 14:02:31 -0400 (0:00:00.554) 0:01:27.847 ****** 30582 1726855351.49714: entering _queue_task() for managed_node3/include_role 30582 1726855351.50493: worker is 1 (out of 1 available) 30582 1726855351.50509: exiting _queue_task() for managed_node3/include_role 30582 1726855351.50524: done queuing things up, now waiting for results queue to drain 30582 1726855351.50526: waiting for pending results... 
30582 1726855351.51015: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855351.51377: in run() - task 0affcc66-ac2b-aa83-7d57-0000000017d9 30582 1726855351.51394: variable 'ansible_search_path' from source: unknown 30582 1726855351.51399: variable 'ansible_search_path' from source: unknown 30582 1726855351.51433: calling self._execute() 30582 1726855351.51695: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855351.51698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855351.51709: variable 'omit' from source: magic vars 30582 1726855351.52695: variable 'ansible_distribution_major_version' from source: facts 30582 1726855351.52698: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855351.52701: _execute() done 30582 1726855351.52703: dumping result to json 30582 1726855351.52706: done dumping result, returning 30582 1726855351.52708: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-0000000017d9] 30582 1726855351.52710: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d9 30582 1726855351.52998: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000017d9 30582 1726855351.53001: WORKER PROCESS EXITING 30582 1726855351.53042: no more pending results, returning what we have 30582 1726855351.53047: in VariableManager get_vars() 30582 1726855351.53099: Calling all_inventory to load vars for managed_node3 30582 1726855351.53103: Calling groups_inventory to load vars for managed_node3 30582 1726855351.53106: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.53118: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.53121: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.53124: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.55107: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.56672: done with get_vars() 30582 1726855351.56701: variable 'ansible_search_path' from source: unknown 30582 1726855351.56702: variable 'ansible_search_path' from source: unknown 30582 1726855351.56856: variable 'omit' from source: magic vars 30582 1726855351.57032: variable 'omit' from source: magic vars 30582 1726855351.57050: variable 'omit' from source: magic vars 30582 1726855351.57055: we have included files to process 30582 1726855351.57056: generating all_blocks data 30582 1726855351.57059: done generating all_blocks data 30582 1726855351.57064: processing included file: fedora.linux_system_roles.network 30582 1726855351.57084: in VariableManager get_vars() 30582 1726855351.57103: done with get_vars() 30582 1726855351.57201: in VariableManager get_vars() 30582 1726855351.57220: done with get_vars() 30582 1726855351.57326: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855351.57801: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855351.57882: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855351.58463: in VariableManager get_vars() 30582 1726855351.58489: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855351.60557: iterating over new_blocks loaded from include file 30582 1726855351.60560: in VariableManager get_vars() 30582 1726855351.60581: done with get_vars() 30582 1726855351.60583: filtering new block on tags 30582 1726855351.60899: done filtering new block on tags 30582 1726855351.60903: in VariableManager get_vars() 30582 1726855351.60920: done with get_vars() 30582 1726855351.60921: filtering new block on tags 30582 1726855351.60943: done 
filtering new block on tags 30582 1726855351.60945: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855351.60951: extending task lists for all hosts with included blocks 30582 1726855351.61077: done extending task lists 30582 1726855351.61079: done processing included files 30582 1726855351.61080: results queue empty 30582 1726855351.61081: checking for any_errors_fatal 30582 1726855351.61083: done checking for any_errors_fatal 30582 1726855351.61084: checking for max_fail_percentage 30582 1726855351.61085: done checking for max_fail_percentage 30582 1726855351.61086: checking to see if all hosts have failed and the running result is not ok 30582 1726855351.61088: done checking to see if all hosts have failed 30582 1726855351.61089: getting the remaining hosts for this loop 30582 1726855351.61090: done getting the remaining hosts for this loop 30582 1726855351.61093: getting the next task for host managed_node3 30582 1726855351.61099: done getting next task for host managed_node3 30582 1726855351.61102: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855351.61105: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855351.61118: getting variables 30582 1726855351.61119: in VariableManager get_vars() 30582 1726855351.61134: Calling all_inventory to load vars for managed_node3 30582 1726855351.61137: Calling groups_inventory to load vars for managed_node3 30582 1726855351.61139: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.61145: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.61153: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.61157: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.62528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.64101: done with get_vars() 30582 1726855351.64133: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:31 -0400 (0:00:00.145) 0:01:27.992 ****** 30582 1726855351.64221: entering _queue_task() for managed_node3/include_tasks 30582 1726855351.64664: worker is 1 (out of 1 available) 30582 1726855351.64675: exiting _queue_task() for managed_node3/include_tasks 30582 1726855351.64686: done queuing things up, now waiting for results queue to drain 30582 1726855351.64744: waiting for pending results... 
30582 1726855351.64967: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855351.65126: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b3b 30582 1726855351.65149: variable 'ansible_search_path' from source: unknown 30582 1726855351.65156: variable 'ansible_search_path' from source: unknown 30582 1726855351.65206: calling self._execute() 30582 1726855351.65395: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855351.65400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855351.65404: variable 'omit' from source: magic vars 30582 1726855351.65743: variable 'ansible_distribution_major_version' from source: facts 30582 1726855351.65759: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855351.65770: _execute() done 30582 1726855351.65777: dumping result to json 30582 1726855351.65784: done dumping result, returning 30582 1726855351.65798: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000001b3b] 30582 1726855351.65810: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3b 30582 1726855351.65938: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3b 30582 1726855351.66007: no more pending results, returning what we have 30582 1726855351.66014: in VariableManager get_vars() 30582 1726855351.66074: Calling all_inventory to load vars for managed_node3 30582 1726855351.66078: Calling groups_inventory to load vars for managed_node3 30582 1726855351.66081: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.66096: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.66100: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.66103: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855351.66803: WORKER PROCESS EXITING 30582 1726855351.68650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.70640: done with get_vars() 30582 1726855351.70665: variable 'ansible_search_path' from source: unknown 30582 1726855351.70667: variable 'ansible_search_path' from source: unknown 30582 1726855351.70877: we have included files to process 30582 1726855351.70878: generating all_blocks data 30582 1726855351.70881: done generating all_blocks data 30582 1726855351.70885: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855351.70886: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855351.70891: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855351.72146: done processing included file 30582 1726855351.72148: iterating over new_blocks loaded from include file 30582 1726855351.72149: in VariableManager get_vars() 30582 1726855351.72296: done with get_vars() 30582 1726855351.72299: filtering new block on tags 30582 1726855351.72334: done filtering new block on tags 30582 1726855351.72338: in VariableManager get_vars() 30582 1726855351.72362: done with get_vars() 30582 1726855351.72364: filtering new block on tags 30582 1726855351.72518: done filtering new block on tags 30582 1726855351.72521: in VariableManager get_vars() 30582 1726855351.72545: done with get_vars() 30582 1726855351.72547: filtering new block on tags 30582 1726855351.72602: done filtering new block on tags 30582 1726855351.72605: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855351.72611: extending task lists for all hosts 
with included blocks 30582 1726855351.74469: done extending task lists 30582 1726855351.74471: done processing included files 30582 1726855351.74472: results queue empty 30582 1726855351.74473: checking for any_errors_fatal 30582 1726855351.74477: done checking for any_errors_fatal 30582 1726855351.74477: checking for max_fail_percentage 30582 1726855351.74479: done checking for max_fail_percentage 30582 1726855351.74480: checking to see if all hosts have failed and the running result is not ok 30582 1726855351.74480: done checking to see if all hosts have failed 30582 1726855351.74481: getting the remaining hosts for this loop 30582 1726855351.74483: done getting the remaining hosts for this loop 30582 1726855351.74486: getting the next task for host managed_node3 30582 1726855351.74492: done getting next task for host managed_node3 30582 1726855351.74495: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855351.74498: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855351.74561: getting variables 30582 1726855351.74563: in VariableManager get_vars() 30582 1726855351.74582: Calling all_inventory to load vars for managed_node3 30582 1726855351.74585: Calling groups_inventory to load vars for managed_node3 30582 1726855351.74644: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.74652: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.74656: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.74659: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.76544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855351.79800: done with get_vars() 30582 1726855351.79833: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:31 -0400 (0:00:00.158) 0:01:28.150 ****** 30582 1726855351.80043: entering _queue_task() for managed_node3/setup 30582 1726855351.80729: worker is 1 (out of 1 available) 30582 1726855351.80859: exiting _queue_task() for managed_node3/setup 30582 1726855351.80872: done queuing things up, now waiting for results queue to drain 30582 1726855351.80874: waiting for pending results... 
30582 1726855351.81598: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855351.82146: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b92 30582 1726855351.82160: variable 'ansible_search_path' from source: unknown 30582 1726855351.82164: variable 'ansible_search_path' from source: unknown 30582 1726855351.82205: calling self._execute() 30582 1726855351.82703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855351.82707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855351.82718: variable 'omit' from source: magic vars 30582 1726855351.83932: variable 'ansible_distribution_major_version' from source: facts 30582 1726855351.83946: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855351.84594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855351.90729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855351.91202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855351.91242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855351.91281: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855351.91309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855351.91799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855351.91831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855351.91856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855351.91992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855351.91997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855351.92000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855351.92003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855351.92420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855351.92458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855351.92475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855351.93042: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855351.93053: variable 'ansible_facts' from source: unknown 30582 1726855351.95830: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855351.95835: when evaluation is False, skipping this task 30582 1726855351.95837: _execute() done 30582 1726855351.95839: dumping result to json 30582 1726855351.95843: done dumping result, returning 30582 1726855351.95846: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000001b92] 30582 1726855351.95994: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b92 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855351.96158: no more pending results, returning what we have 30582 1726855351.96163: results queue empty 30582 1726855351.96164: checking for any_errors_fatal 30582 1726855351.96165: done checking for any_errors_fatal 30582 1726855351.96166: checking for max_fail_percentage 30582 1726855351.96168: done checking for max_fail_percentage 30582 1726855351.96170: checking to see if all hosts have failed and the running result is not ok 30582 1726855351.96170: done checking to see if all hosts have failed 30582 1726855351.96171: getting the remaining hosts for this loop 30582 1726855351.96173: done getting the remaining hosts for this loop 30582 1726855351.96177: getting the next task for host managed_node3 30582 1726855351.96190: done getting next task for host managed_node3 30582 1726855351.96194: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855351.96200: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855351.96397: getting variables 30582 1726855351.96399: in VariableManager get_vars() 30582 1726855351.96437: Calling all_inventory to load vars for managed_node3 30582 1726855351.96440: Calling groups_inventory to load vars for managed_node3 30582 1726855351.96442: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855351.96566: Calling all_plugins_play to load vars for managed_node3 30582 1726855351.96570: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855351.96574: Calling groups_plugins_play to load vars for managed_node3 30582 1726855351.97806: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b92 30582 1726855351.97815: WORKER PROCESS EXITING 30582 1726855352.01504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855352.06313: done with get_vars() 30582 1726855352.06351: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:32 -0400 (0:00:00.265) 0:01:28.415 ****** 30582 1726855352.06582: entering _queue_task() for managed_node3/stat 30582 1726855352.07464: worker is 1 (out of 1 available) 30582 1726855352.07478: exiting _queue_task() for managed_node3/stat 30582 1726855352.07600: done queuing things up, now waiting for results queue to drain 30582 1726855352.07602: waiting for pending results... 
30582 1726855352.07998: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855352.08394: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b94 30582 1726855352.08570: variable 'ansible_search_path' from source: unknown 30582 1726855352.08580: variable 'ansible_search_path' from source: unknown 30582 1726855352.08624: calling self._execute() 30582 1726855352.08735: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855352.08797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855352.08864: variable 'omit' from source: magic vars 30582 1726855352.09660: variable 'ansible_distribution_major_version' from source: facts 30582 1726855352.09680: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855352.10025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855352.10744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855352.10751: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855352.10790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855352.10889: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855352.11128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855352.11149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855352.11208: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855352.11263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855352.11502: variable '__network_is_ostree' from source: set_fact 30582 1726855352.11516: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855352.11612: when evaluation is False, skipping this task 30582 1726855352.11615: _execute() done 30582 1726855352.11618: dumping result to json 30582 1726855352.11620: done dumping result, returning 30582 1726855352.11623: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000001b94] 30582 1726855352.11625: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b94 30582 1726855352.11939: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b94 30582 1726855352.11943: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855352.12000: no more pending results, returning what we have 30582 1726855352.12005: results queue empty 30582 1726855352.12006: checking for any_errors_fatal 30582 1726855352.12015: done checking for any_errors_fatal 30582 1726855352.12016: checking for max_fail_percentage 30582 1726855352.12018: done checking for max_fail_percentage 30582 1726855352.12019: checking to see if all hosts have failed and the running result is not ok 30582 1726855352.12020: done checking to see if all hosts have failed 30582 1726855352.12021: getting the remaining hosts for this loop 30582 1726855352.12023: done getting the remaining hosts for this loop 30582 
1726855352.12027: getting the next task for host managed_node3 30582 1726855352.12036: done getting next task for host managed_node3 30582 1726855352.12155: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855352.12162: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855352.12197: getting variables 30582 1726855352.12199: in VariableManager get_vars() 30582 1726855352.12248: Calling all_inventory to load vars for managed_node3 30582 1726855352.12251: Calling groups_inventory to load vars for managed_node3 30582 1726855352.12255: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855352.12502: Calling all_plugins_play to load vars for managed_node3 30582 1726855352.12507: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855352.12510: Calling groups_plugins_play to load vars for managed_node3 30582 1726855352.15662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855352.19546: done with get_vars() 30582 1726855352.19580: done getting variables 30582 1726855352.19753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:32 -0400 (0:00:00.132) 0:01:28.547 ****** 30582 1726855352.19842: entering _queue_task() for managed_node3/set_fact 30582 1726855352.20825: worker is 1 (out of 1 available) 30582 1726855352.20835: exiting _queue_task() for managed_node3/set_fact 30582 1726855352.20846: done queuing things up, now waiting for results queue to drain 30582 1726855352.20848: waiting for pending results... 
30582 1726855352.21242: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855352.21496: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b95 30582 1726855352.21663: variable 'ansible_search_path' from source: unknown 30582 1726855352.21667: variable 'ansible_search_path' from source: unknown 30582 1726855352.21671: calling self._execute() 30582 1726855352.21847: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855352.21891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855352.21941: variable 'omit' from source: magic vars 30582 1726855352.22794: variable 'ansible_distribution_major_version' from source: facts 30582 1726855352.22903: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855352.23145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855352.23768: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855352.23876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855352.23972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855352.24031: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855352.24220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855352.24251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855352.24292: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855352.24372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855352.24453: variable '__network_is_ostree' from source: set_fact 30582 1726855352.24467: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855352.24480: when evaluation is False, skipping this task 30582 1726855352.24490: _execute() done 30582 1726855352.24497: dumping result to json 30582 1726855352.24542: done dumping result, returning 30582 1726855352.24546: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000001b95] 30582 1726855352.24549: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b95 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855352.24756: no more pending results, returning what we have 30582 1726855352.24760: results queue empty 30582 1726855352.24762: checking for any_errors_fatal 30582 1726855352.24770: done checking for any_errors_fatal 30582 1726855352.24771: checking for max_fail_percentage 30582 1726855352.24773: done checking for max_fail_percentage 30582 1726855352.24774: checking to see if all hosts have failed and the running result is not ok 30582 1726855352.24775: done checking to see if all hosts have failed 30582 1726855352.24775: getting the remaining hosts for this loop 30582 1726855352.24777: done getting the remaining hosts for this loop 30582 1726855352.24781: getting the next task for host managed_node3 30582 1726855352.24868: done getting next task for host managed_node3 30582 
1726855352.24873: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855352.24879: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855352.24894: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b95 30582 1726855352.24898: WORKER PROCESS EXITING 30582 1726855352.24931: getting variables 30582 1726855352.24934: in VariableManager get_vars() 30582 1726855352.25141: Calling all_inventory to load vars for managed_node3 30582 1726855352.25145: Calling groups_inventory to load vars for managed_node3 30582 1726855352.25147: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855352.25157: Calling all_plugins_play to load vars for managed_node3 30582 1726855352.25161: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855352.25164: Calling groups_plugins_play to load vars for managed_node3 30582 1726855352.26944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855352.28647: done with get_vars() 30582 1726855352.28675: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:32 -0400 (0:00:00.089) 0:01:28.637 ****** 30582 1726855352.28786: entering _queue_task() for managed_node3/service_facts 30582 1726855352.29323: worker is 1 (out of 1 available) 30582 1726855352.29335: exiting _queue_task() for managed_node3/service_facts 30582 1726855352.29346: done queuing things up, now waiting for results queue to drain 30582 1726855352.29347: waiting for pending results... 
30582 1726855352.29813: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855352.30052: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b97 30582 1726855352.30081: variable 'ansible_search_path' from source: unknown 30582 1726855352.30101: variable 'ansible_search_path' from source: unknown 30582 1726855352.30219: calling self._execute() 30582 1726855352.30329: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855352.30671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855352.30675: variable 'omit' from source: magic vars 30582 1726855352.31117: variable 'ansible_distribution_major_version' from source: facts 30582 1726855352.31135: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855352.31148: variable 'omit' from source: magic vars 30582 1726855352.31247: variable 'omit' from source: magic vars 30582 1726855352.31288: variable 'omit' from source: magic vars 30582 1726855352.31343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855352.31385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855352.31449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855352.31473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855352.31493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855352.31526: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855352.31542: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855352.31553: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855352.31674: Set connection var ansible_timeout to 10 30582 1726855352.31683: Set connection var ansible_connection to ssh 30582 1726855352.31698: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855352.31708: Set connection var ansible_pipelining to False 30582 1726855352.31766: Set connection var ansible_shell_executable to /bin/sh 30582 1726855352.31769: Set connection var ansible_shell_type to sh 30582 1726855352.31771: variable 'ansible_shell_executable' from source: unknown 30582 1726855352.31774: variable 'ansible_connection' from source: unknown 30582 1726855352.31776: variable 'ansible_module_compression' from source: unknown 30582 1726855352.31778: variable 'ansible_shell_type' from source: unknown 30582 1726855352.31780: variable 'ansible_shell_executable' from source: unknown 30582 1726855352.31782: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855352.31789: variable 'ansible_pipelining' from source: unknown 30582 1726855352.31797: variable 'ansible_timeout' from source: unknown 30582 1726855352.31805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855352.32024: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855352.32042: variable 'omit' from source: magic vars 30582 1726855352.32100: starting attempt loop 30582 1726855352.32104: running the handler 30582 1726855352.32107: _low_level_execute_command(): starting 30582 1726855352.32109: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855352.32977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855352.33009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855352.33034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855352.33124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855352.34872: stdout chunk (state=3): >>>/root <<< 30582 1726855352.34999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855352.35133: stderr chunk (state=3): >>><<< 30582 1726855352.35137: stdout chunk (state=3): >>><<< 30582 1726855352.35154: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855352.35214: _low_level_execute_command(): starting 30582 1726855352.35227: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568 `" && echo ansible-tmp-1726855352.3520036-34729-238277762550568="` echo /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568 `" ) && sleep 0' 30582 1726855352.36318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855352.36422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855352.36426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855352.36429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855352.36431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 
1726855352.36548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855352.36551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855352.36612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855352.36624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855352.36715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855352.36730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855352.38814: stdout chunk (state=3): >>>ansible-tmp-1726855352.3520036-34729-238277762550568=/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568 <<< 30582 1726855352.39255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855352.39290: stderr chunk (state=3): >>><<< 30582 1726855352.39440: stdout chunk (state=3): >>><<< 30582 1726855352.39445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855352.3520036-34729-238277762550568=/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855352.39470: variable 'ansible_module_compression' from source: unknown 30582 1726855352.39585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855352.39696: variable 'ansible_facts' from source: unknown 30582 1726855352.39917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py 30582 1726855352.40298: Sending initial data 30582 1726855352.40302: Sent initial data (162 bytes) 30582 1726855352.41641: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855352.41693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855352.41765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855352.41780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855352.41883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855352.43580: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855352.43616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855352.43810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi6cf5mhf /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py <<< 30582 1726855352.43815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py" <<< 30582 1726855352.44002: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpi6cf5mhf" to remote "/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py" <<< 30582 1726855352.45615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855352.45619: stderr chunk (state=3): >>><<< 30582 1726855352.45623: stdout chunk (state=3): >>><<< 30582 1726855352.46096: done transferring module to remote 30582 1726855352.46100: _low_level_execute_command(): starting 30582 1726855352.46103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/ /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py && sleep 0' 30582 1726855352.47086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855352.47217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855352.47307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855352.49215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855352.49243: stderr chunk (state=3): >>><<< 30582 1726855352.49252: stdout chunk (state=3): >>><<< 30582 1726855352.49271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855352.49307: _low_level_execute_command(): starting 30582 1726855352.49313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/AnsiballZ_service_facts.py && sleep 0' 30582 1726855352.50370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855352.50374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855352.50377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855352.50449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.05397: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855354.05496: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": 
"systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": 
{"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": 
{"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855354.07079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855354.07165: stderr chunk (state=3): >>><<< 30582 1726855354.07175: stdout chunk (state=3): >>><<< 30582 1726855354.07415: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
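The service_facts result above is a mapping of unit name to `{name, state, status, source}` records. As a rough sketch of where that shape comes from (not Ansible's actual implementation — `parse_unit_files` is a hypothetical helper), the enablement column of `systemctl list-unit-files --type=service` yields the `status` field, while `state` comes from a separate `list-units` pass and is `"unknown"` for templated units:

```python
# Hedged sketch: build service_facts-style records from the output of
# `systemctl list-unit-files --type=service --no-legend`.
# parse_unit_files is a hypothetical helper, not part of Ansible.

def parse_unit_files(listing: str) -> dict:
    """Map unit name -> {name, state, status, source}.

    The unit-file listing only exposes the enablement status
    (static/enabled/disabled/...); the runtime state would need a
    second `systemctl list-units` pass, so it is left as 'unknown'
    here, matching what the log shows for templated units.
    """
    services = {}
    for line in listing.splitlines():
        parts = line.split()
        if len(parts) < 2:
            continue  # skip blank or malformed lines
        name, status = parts[0], parts[1]
        services[name] = {
            "name": name,
            "state": "unknown",
            "status": status,
            "source": "systemd",
        }
    return services

sample = """\
systemd-udevd.service  static   -
dnsmasq.service        disabled disabled
getty@.service         enabled  enabled
"""
facts = parse_unit_files(sample)
```

On a live host the listing would come from `subprocess.run(["systemctl", "list-unit-files", "--type=service", "--no-legend"], ...)` instead of the sample string.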
30582 1726855354.09111: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855354.09119: _low_level_execute_command(): starting 30582 1726855354.09245: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855352.3520036-34729-238277762550568/ > /dev/null 2>&1 && sleep 0' 30582 1726855354.10695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.10699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.10702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855354.10718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.10816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.12726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855354.12861: stderr chunk (state=3): >>><<< 30582 1726855354.12864: stdout chunk (state=3): >>><<< 30582 1726855354.12941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855354.12952: handler run complete 30582 1726855354.13667: variable 'ansible_facts' from source: 
unknown 30582 1726855354.14208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855354.14730: variable 'ansible_facts' from source: unknown 30582 1726855354.14871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855354.15083: attempt loop complete, returning result 30582 1726855354.15104: _execute() done 30582 1726855354.15116: dumping result to json 30582 1726855354.15181: done dumping result, returning 30582 1726855354.15206: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000001b97] 30582 1726855354.15222: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b97 30582 1726855354.28320: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b97 30582 1726855354.28324: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855354.28451: no more pending results, returning what we have 30582 1726855354.28455: results queue empty 30582 1726855354.28456: checking for any_errors_fatal 30582 1726855354.28461: done checking for any_errors_fatal 30582 1726855354.28462: checking for max_fail_percentage 30582 1726855354.28466: done checking for max_fail_percentage 30582 1726855354.28467: checking to see if all hosts have failed and the running result is not ok 30582 1726855354.28468: done checking to see if all hosts have failed 30582 1726855354.28469: getting the remaining hosts for this loop 30582 1726855354.28470: done getting the remaining hosts for this loop 30582 1726855354.28474: getting the next task for host managed_node3 30582 1726855354.28481: done getting next task for host managed_node3 30582 1726855354.28484: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 30582 1726855354.28492: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855354.28505: getting variables 30582 1726855354.28506: in VariableManager get_vars() 30582 1726855354.28878: Calling all_inventory to load vars for managed_node3 30582 1726855354.28881: Calling groups_inventory to load vars for managed_node3 30582 1726855354.28884: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855354.28896: Calling all_plugins_play to load vars for managed_node3 30582 1726855354.28899: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855354.28902: Calling groups_plugins_play to load vars for managed_node3 30582 1726855354.39599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855354.41336: done with get_vars() 30582 1726855354.41403: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:34 -0400 (0:00:02.127) 0:01:30.765 ****** 30582 1726855354.41530: entering _queue_task() for managed_node3/package_facts 30582 1726855354.42329: worker is 1 (out of 1 available) 30582 1726855354.42343: exiting _queue_task() for managed_node3/package_facts 30582 1726855354.42472: done queuing things up, now waiting for results queue to drain 30582 1726855354.42475: waiting for pending results... 
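The task queued above runs the package_facts module, which reports installed packages keyed by name, with a list of versions per name (a package can be installed for multiple architectures). A minimal sketch of that data shape, assuming rpm output in a pipe-delimited query format (`parse_rpm_qa` is a hypothetical helper, not Ansible's implementation):

```python
# Hedged sketch: parse output shaped like
#   rpm -qa --queryformat '%{NAME}|%{VERSION}|%{RELEASE}|%{ARCH}\n'
# into the package_facts layout: name -> list of per-install records.
# parse_rpm_qa is a hypothetical helper, not part of Ansible.

def parse_rpm_qa(listing: str) -> dict:
    packages = {}
    for line in listing.splitlines():
        try:
            name, version, release, arch = line.split("|")
        except ValueError:
            continue  # skip lines that don't match the query format
        # A name maps to a list: multilib installs yield several records.
        packages.setdefault(name, []).append({
            "name": name,
            "version": version,
            "release": release,
            "arch": arch,
            "source": "rpm",
        })
    return packages

sample = "openssh|9.8p1|1.fc41|x86_64\nchrony|4.5|1.fc41|x86_64"
pkgs = parse_rpm_qa(sample)
```

The per-name list (rather than a single record) is why consumers of `ansible_facts.packages` index as `packages['chrony'][0]['version']`.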
30582 1726855354.43007: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855354.43067: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b98 30582 1726855354.43072: variable 'ansible_search_path' from source: unknown 30582 1726855354.43076: variable 'ansible_search_path' from source: unknown 30582 1726855354.43194: calling self._execute() 30582 1726855354.43224: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855354.43230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855354.43565: variable 'omit' from source: magic vars 30582 1726855354.43711: variable 'ansible_distribution_major_version' from source: facts 30582 1726855354.43723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855354.43730: variable 'omit' from source: magic vars 30582 1726855354.43819: variable 'omit' from source: magic vars 30582 1726855354.43857: variable 'omit' from source: magic vars 30582 1726855354.43908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855354.44094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855354.44098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855354.44100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855354.44103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855354.44107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855354.44109: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855354.44113: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855354.44177: Set connection var ansible_timeout to 10 30582 1726855354.44185: Set connection var ansible_connection to ssh 30582 1726855354.44201: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855354.44210: Set connection var ansible_pipelining to False 30582 1726855354.44230: Set connection var ansible_shell_executable to /bin/sh 30582 1726855354.44293: Set connection var ansible_shell_type to sh 30582 1726855354.44296: variable 'ansible_shell_executable' from source: unknown 30582 1726855354.44298: variable 'ansible_connection' from source: unknown 30582 1726855354.44300: variable 'ansible_module_compression' from source: unknown 30582 1726855354.44301: variable 'ansible_shell_type' from source: unknown 30582 1726855354.44304: variable 'ansible_shell_executable' from source: unknown 30582 1726855354.44305: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855354.44307: variable 'ansible_pipelining' from source: unknown 30582 1726855354.44308: variable 'ansible_timeout' from source: unknown 30582 1726855354.44310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855354.44523: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855354.44542: variable 'omit' from source: magic vars 30582 1726855354.44562: starting attempt loop 30582 1726855354.44574: running the handler 30582 1726855354.44599: _low_level_execute_command(): starting 30582 1726855354.44613: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855354.45495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855354.45513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.45555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855354.45578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855354.45589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.45681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.47401: stdout chunk (state=3): >>>/root <<< 30582 1726855354.47716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855354.47720: stdout chunk (state=3): >>><<< 30582 1726855354.47723: stderr chunk (state=3): >>><<< 30582 1726855354.47726: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855354.47729: _low_level_execute_command(): starting 30582 1726855354.47732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262 `" && echo ansible-tmp-1726855354.4759078-34821-29618062368262="` echo /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262 `" ) && sleep 0' 30582 1726855354.48291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855354.48307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855354.48318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855354.48345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855354.48405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.48461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855354.48478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855354.48498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.48586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.50497: stdout chunk (state=3): >>>ansible-tmp-1726855354.4759078-34821-29618062368262=/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262 <<< 30582 1726855354.50661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855354.50667: stdout chunk (state=3): >>><<< 30582 1726855354.50671: stderr chunk (state=3): >>><<< 30582 1726855354.50893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855354.4759078-34821-29618062368262=/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855354.50897: variable 'ansible_module_compression' from source: unknown 30582 1726855354.50899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855354.50902: variable 'ansible_facts' from source: unknown 30582 1726855354.51083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py 30582 1726855354.51258: Sending initial data 30582 1726855354.51271: Sent initial data (161 bytes) 30582 1726855354.51954: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855354.51969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855354.51985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855354.52021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855354.52133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855354.52153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.52261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.53872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855354.53947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855354.54004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0jzx57it /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py <<< 30582 1726855354.54007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py" <<< 30582 1726855354.54061: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp0jzx57it" to remote "/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py" <<< 30582 1726855354.55306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855354.55366: stderr chunk (state=3): >>><<< 30582 1726855354.55369: stdout chunk (state=3): >>><<< 30582 1726855354.55389: done transferring module to remote 30582 1726855354.55396: _low_level_execute_command(): starting 30582 1726855354.55401: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/ /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py && sleep 0' 30582 1726855354.56071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855354.56098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855354.56144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.56220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855354.56251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.56347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855354.58135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855354.58158: stderr chunk (state=3): >>><<< 30582 1726855354.58162: stdout chunk (state=3): >>><<< 30582 1726855354.58182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855354.58185: _low_level_execute_command(): starting 30582 1726855354.58194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/AnsiballZ_package_facts.py && sleep 0' 30582 1726855354.58627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855354.58631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.58633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855354.58636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855354.58679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855354.58683: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855354.58753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855355.03721: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": 
"1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", 
"version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855355.03742: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", 
"version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", 
"version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855355.05422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855355.05426: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855355.05428: stderr chunk (state=3): >>><<< 30582 1726855355.05510: stdout chunk (state=3): >>><<< 30582 1726855355.05610: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": 
"5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": 
"libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": 
"127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": 
"systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": 
"iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": 
[{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", 
"version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": 
[{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": 
"0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855355.08785: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855355.08890: _low_level_execute_command(): starting 30582 1726855355.08901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855354.4759078-34821-29618062368262/ > /dev/null 2>&1 && sleep 0' 30582 1726855355.10225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855355.10335: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855355.10338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855355.10405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855355.10546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855355.12404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855355.12529: stderr chunk (state=3): >>><<< 30582 1726855355.12532: stdout chunk (state=3): >>><<< 30582 1726855355.12535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855355.12537: handler run complete 30582 1726855355.14426: variable 'ansible_facts' from source: unknown 30582 1726855355.15601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.20397: variable 'ansible_facts' from source: unknown 30582 1726855355.21312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.22353: attempt loop complete, returning result 30582 1726855355.22383: _execute() done 30582 1726855355.22394: dumping result to json 30582 1726855355.22626: done dumping result, returning 30582 1726855355.22643: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000001b98] 30582 1726855355.22659: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b98 30582 1726855355.25954: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b98 30582 1726855355.25958: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855355.26124: no more pending results, returning what we have 30582 1726855355.26127: results queue empty 30582 1726855355.26128: checking for any_errors_fatal 30582 1726855355.26136: done checking for any_errors_fatal 30582 1726855355.26137: checking for max_fail_percentage 30582 1726855355.26138: done checking for max_fail_percentage 30582 1726855355.26139: checking to see if all hosts 
have failed and the running result is not ok 30582 1726855355.26140: done checking to see if all hosts have failed 30582 1726855355.26141: getting the remaining hosts for this loop 30582 1726855355.26142: done getting the remaining hosts for this loop 30582 1726855355.26145: getting the next task for host managed_node3 30582 1726855355.26153: done getting next task for host managed_node3 30582 1726855355.26156: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855355.26162: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855355.26178: getting variables 30582 1726855355.26180: in VariableManager get_vars() 30582 1726855355.26216: Calling all_inventory to load vars for managed_node3 30582 1726855355.26219: Calling groups_inventory to load vars for managed_node3 30582 1726855355.26221: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.26230: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.26233: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.26236: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.28060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.30084: done with get_vars() 30582 1726855355.30326: done getting variables 30582 1726855355.30394: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:35 -0400 (0:00:00.889) 0:01:31.654 ****** 30582 1726855355.30437: entering _queue_task() for managed_node3/debug 30582 1726855355.31230: worker is 1 (out of 1 available) 30582 1726855355.31246: exiting _queue_task() for managed_node3/debug 30582 1726855355.31259: done queuing things up, now waiting for results queue to drain 30582 1726855355.31261: waiting for pending results... 
30582 1726855355.31830: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855355.32041: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b3c 30582 1726855355.32142: variable 'ansible_search_path' from source: unknown 30582 1726855355.32145: variable 'ansible_search_path' from source: unknown 30582 1726855355.32215: calling self._execute() 30582 1726855355.32433: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.32437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.32448: variable 'omit' from source: magic vars 30582 1726855355.33403: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.33422: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.33425: variable 'omit' from source: magic vars 30582 1726855355.33596: variable 'omit' from source: magic vars 30582 1726855355.33705: variable 'network_provider' from source: set_fact 30582 1726855355.33841: variable 'omit' from source: magic vars 30582 1726855355.33882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855355.33920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855355.34012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855355.34075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855355.34079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855355.34081: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855355.34083: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855355.34085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.34377: Set connection var ansible_timeout to 10 30582 1726855355.34380: Set connection var ansible_connection to ssh 30582 1726855355.34511: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855355.34515: Set connection var ansible_pipelining to False 30582 1726855355.34517: Set connection var ansible_shell_executable to /bin/sh 30582 1726855355.34520: Set connection var ansible_shell_type to sh 30582 1726855355.34522: variable 'ansible_shell_executable' from source: unknown 30582 1726855355.34524: variable 'ansible_connection' from source: unknown 30582 1726855355.34526: variable 'ansible_module_compression' from source: unknown 30582 1726855355.34528: variable 'ansible_shell_type' from source: unknown 30582 1726855355.34530: variable 'ansible_shell_executable' from source: unknown 30582 1726855355.34532: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.34534: variable 'ansible_pipelining' from source: unknown 30582 1726855355.34536: variable 'ansible_timeout' from source: unknown 30582 1726855355.34538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.34819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855355.34833: variable 'omit' from source: magic vars 30582 1726855355.34839: starting attempt loop 30582 1726855355.34842: running the handler 30582 1726855355.34885: handler run complete 30582 1726855355.35032: attempt loop complete, returning result 30582 1726855355.35036: _execute() done 30582 1726855355.35039: dumping result to json 30582 1726855355.35042: done dumping result, returning 
30582 1726855355.35093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000001b3c] 30582 1726855355.35097: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3c ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855355.35382: no more pending results, returning what we have 30582 1726855355.35386: results queue empty 30582 1726855355.35389: checking for any_errors_fatal 30582 1726855355.35399: done checking for any_errors_fatal 30582 1726855355.35399: checking for max_fail_percentage 30582 1726855355.35401: done checking for max_fail_percentage 30582 1726855355.35402: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.35403: done checking to see if all hosts have failed 30582 1726855355.35403: getting the remaining hosts for this loop 30582 1726855355.35405: done getting the remaining hosts for this loop 30582 1726855355.35408: getting the next task for host managed_node3 30582 1726855355.35417: done getting next task for host managed_node3 30582 1726855355.35420: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855355.35425: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.35440: getting variables 30582 1726855355.35441: in VariableManager get_vars() 30582 1726855355.35606: Calling all_inventory to load vars for managed_node3 30582 1726855355.35609: Calling groups_inventory to load vars for managed_node3 30582 1726855355.35612: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.35622: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.35625: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.35627: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.36246: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3c 30582 1726855355.36250: WORKER PROCESS EXITING 30582 1726855355.38594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.40562: done with get_vars() 30582 1726855355.40810: done getting variables 30582 1726855355.40872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:35 -0400 (0:00:00.104) 0:01:31.759 ****** 30582 1726855355.40917: entering _queue_task() for managed_node3/fail 30582 1726855355.41697: worker is 1 (out of 1 available) 30582 1726855355.41711: exiting _queue_task() for managed_node3/fail 30582 1726855355.41723: done queuing things up, now waiting for results queue to drain 30582 1726855355.41725: waiting for pending results... 30582 1726855355.42330: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855355.42560: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b3d 30582 1726855355.42574: variable 'ansible_search_path' from source: unknown 30582 1726855355.42577: variable 'ansible_search_path' from source: unknown 30582 1726855355.42718: calling self._execute() 30582 1726855355.42862: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.42868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.42875: variable 'omit' from source: magic vars 30582 1726855355.43603: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.43703: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.43746: variable 'network_state' from source: role '' defaults 30582 1726855355.43757: Evaluated conditional (network_state != {}): False 30582 1726855355.43761: when evaluation is False, skipping this task 30582 1726855355.43766: _execute() done 30582 1726855355.43770: dumping result to json 30582 1726855355.43772: done dumping result, returning 30582 1726855355.43775: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000001b3d] 30582 1726855355.43782: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3d 30582 1726855355.43895: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3d 30582 1726855355.43898: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855355.43976: no more pending results, returning what we have 30582 1726855355.43981: results queue empty 30582 1726855355.43982: checking for any_errors_fatal 30582 1726855355.43993: done checking for any_errors_fatal 30582 1726855355.43994: checking for max_fail_percentage 30582 1726855355.43996: done checking for max_fail_percentage 30582 1726855355.43997: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.43997: done checking to see if all hosts have failed 30582 1726855355.43998: getting the remaining hosts for this loop 30582 1726855355.43999: done getting the remaining hosts for this loop 30582 1726855355.44003: getting the next task for host managed_node3 30582 1726855355.44011: done getting next task for host managed_node3 30582 1726855355.44015: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855355.44020: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.44114: getting variables 30582 1726855355.44116: in VariableManager get_vars() 30582 1726855355.44515: Calling all_inventory to load vars for managed_node3 30582 1726855355.44519: Calling groups_inventory to load vars for managed_node3 30582 1726855355.44521: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.44532: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.44535: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.44538: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.47259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.48815: done with get_vars() 30582 1726855355.48853: done getting variables 30582 1726855355.48957: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:35 -0400 (0:00:00.080) 0:01:31.839 ****** 30582 1726855355.49009: entering _queue_task() for managed_node3/fail 30582 1726855355.49905: worker is 1 (out of 1 available) 30582 1726855355.49917: exiting _queue_task() for managed_node3/fail 30582 1726855355.49929: done queuing things up, now waiting for results queue to drain 30582 1726855355.49931: waiting for pending results... 30582 1726855355.50201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855355.50222: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b3e 30582 1726855355.50236: variable 'ansible_search_path' from source: unknown 30582 1726855355.50240: variable 'ansible_search_path' from source: unknown 30582 1726855355.50328: calling self._execute() 30582 1726855355.50547: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.50551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.50554: variable 'omit' from source: magic vars 30582 1726855355.50859: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.50873: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.51015: variable 'network_state' from source: role '' defaults 30582 1726855355.51025: Evaluated conditional (network_state != {}): False 30582 1726855355.51028: when evaluation is False, skipping this task 30582 1726855355.51030: _execute() done 30582 1726855355.51048: dumping result to json 30582 1726855355.51051: done dumping result, returning 30582 1726855355.51060: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000001b3e] 30582 1726855355.51065: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855355.51216: no more pending results, returning what we have 30582 1726855355.51221: results queue empty 30582 1726855355.51222: checking for any_errors_fatal 30582 1726855355.51231: done checking for any_errors_fatal 30582 1726855355.51232: checking for max_fail_percentage 30582 1726855355.51234: done checking for max_fail_percentage 30582 1726855355.51235: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.51236: done checking to see if all hosts have failed 30582 1726855355.51237: getting the remaining hosts for this loop 30582 1726855355.51238: done getting the remaining hosts for this loop 30582 1726855355.51243: getting the next task for host managed_node3 30582 1726855355.51266: done getting next task for host managed_node3 30582 1726855355.51271: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855355.51278: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.51318: getting variables 30582 1726855355.51320: in VariableManager get_vars() 30582 1726855355.51490: Calling all_inventory to load vars for managed_node3 30582 1726855355.51493: Calling groups_inventory to load vars for managed_node3 30582 1726855355.51496: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.51511: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.51515: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.51519: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.52038: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3e 30582 1726855355.52042: WORKER PROCESS EXITING 30582 1726855355.53179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.54561: done with get_vars() 30582 1726855355.54588: done getting variables 30582 1726855355.54637: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:35 -0400 (0:00:00.056) 0:01:31.896 ****** 30582 1726855355.54666: entering _queue_task() for managed_node3/fail 30582 1726855355.54954: worker is 1 (out of 1 available) 30582 1726855355.54988: exiting _queue_task() for managed_node3/fail 30582 1726855355.55000: done queuing things up, now waiting for results queue to drain 30582 1726855355.55002: waiting for pending results... 30582 1726855355.55314: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855355.55384: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b3f 30582 1726855355.55412: variable 'ansible_search_path' from source: unknown 30582 1726855355.55429: variable 'ansible_search_path' from source: unknown 30582 1726855355.55474: calling self._execute() 30582 1726855355.55593: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.55605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.55619: variable 'omit' from source: magic vars 30582 1726855355.56209: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.56297: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.56410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855355.58543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855355.58603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855355.58632: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855355.58661: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855355.58684: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855355.58746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.58783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.58804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.58830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.58842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.58935: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.58949: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855355.59043: variable 'ansible_distribution' from source: facts 30582 1726855355.59047: variable '__network_rh_distros' from source: role '' defaults 30582 1726855355.59079: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855355.59321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.59335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.59358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.59409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.59413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.59456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.59480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.59517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.59592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.59595: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.59598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.59621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.59645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.59683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.59699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.60034: variable 'network_connections' from source: include params 30582 1726855355.60040: variable 'interface' from source: play vars 30582 1726855355.60105: variable 'interface' from source: play vars 30582 1726855355.60116: variable 'network_state' from source: role '' defaults 30582 1726855355.60183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855355.60417: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855355.60460: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 
1726855355.60512: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855355.60546: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855355.60593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855355.60632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855355.60671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.60707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855355.60745: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855355.60754: when evaluation is False, skipping this task 30582 1726855355.60819: _execute() done 30582 1726855355.60824: dumping result to json 30582 1726855355.60827: done dumping result, returning 30582 1726855355.60830: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000001b3f] 30582 1726855355.60832: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b3f 30582 1726855355.60901: done sending task 
result for task 0affcc66-ac2b-aa83-7d57-000000001b3f 30582 1726855355.60904: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855355.60961: no more pending results, returning what we have 30582 1726855355.60965: results queue empty 30582 1726855355.60966: checking for any_errors_fatal 30582 1726855355.60975: done checking for any_errors_fatal 30582 1726855355.60976: checking for max_fail_percentage 30582 1726855355.60978: done checking for max_fail_percentage 30582 1726855355.60979: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.60980: done checking to see if all hosts have failed 30582 1726855355.60981: getting the remaining hosts for this loop 30582 1726855355.60983: done getting the remaining hosts for this loop 30582 1726855355.60990: getting the next task for host managed_node3 30582 1726855355.61000: done getting next task for host managed_node3 30582 1726855355.61004: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855355.61010: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.61042: getting variables 30582 1726855355.61044: in VariableManager get_vars() 30582 1726855355.61305: Calling all_inventory to load vars for managed_node3 30582 1726855355.61309: Calling groups_inventory to load vars for managed_node3 30582 1726855355.61312: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.61324: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.61327: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.61331: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.62414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.63313: done with get_vars() 30582 1726855355.63334: done getting variables 30582 1726855355.63386: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:35 -0400 (0:00:00.087) 0:01:31.984 ****** 30582 1726855355.63412: entering _queue_task() for managed_node3/dnf 30582 1726855355.63704: worker is 1 (out of 1 available) 30582 1726855355.63719: exiting _queue_task() for managed_node3/dnf 30582 1726855355.63729: done queuing things up, now waiting for results queue to drain 30582 1726855355.63731: waiting for pending results... 30582 1726855355.63966: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855355.64193: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b40 30582 1726855355.64196: variable 'ansible_search_path' from source: unknown 30582 1726855355.64199: variable 'ansible_search_path' from source: unknown 30582 1726855355.64202: calling self._execute() 30582 1726855355.64269: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.64280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.64308: variable 'omit' from source: magic vars 30582 1726855355.64719: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.64739: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.64967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855355.67296: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855355.67340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855355.67367: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855355.67398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855355.67421: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855355.67483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.67506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.67527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.67553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.67564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.67656: variable 'ansible_distribution' from source: facts 30582 1726855355.67660: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.67677: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855355.67759: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855355.67849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.67865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.67884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.67910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.67921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.67952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.67972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.67989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.68013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.68024: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.68052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.68072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.68091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.68119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.68129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.68234: variable 'network_connections' from source: include params 30582 1726855355.68244: variable 'interface' from source: play vars 30582 1726855355.68301: variable 'interface' from source: play vars 30582 1726855355.68349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855355.68485: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855355.68518: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855355.68541: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855355.68561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855355.68597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855355.68614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855355.68635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.68655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855355.68694: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855355.68844: variable 'network_connections' from source: include params 30582 1726855355.68847: variable 'interface' from source: play vars 30582 1726855355.68895: variable 'interface' from source: play vars 30582 1726855355.68914: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855355.68917: when evaluation is False, skipping this task 30582 1726855355.68920: _execute() done 30582 1726855355.68922: dumping result to json 30582 1726855355.68925: done dumping result, returning 30582 1726855355.68933: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001b40] 30582 
1726855355.68938: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b40 30582 1726855355.69029: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b40 30582 1726855355.69032: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855355.69118: no more pending results, returning what we have 30582 1726855355.69122: results queue empty 30582 1726855355.69123: checking for any_errors_fatal 30582 1726855355.69129: done checking for any_errors_fatal 30582 1726855355.69129: checking for max_fail_percentage 30582 1726855355.69131: done checking for max_fail_percentage 30582 1726855355.69132: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.69133: done checking to see if all hosts have failed 30582 1726855355.69133: getting the remaining hosts for this loop 30582 1726855355.69135: done getting the remaining hosts for this loop 30582 1726855355.69138: getting the next task for host managed_node3 30582 1726855355.69153: done getting next task for host managed_node3 30582 1726855355.69157: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855355.69162: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.69193: getting variables 30582 1726855355.69195: in VariableManager get_vars() 30582 1726855355.69235: Calling all_inventory to load vars for managed_node3 30582 1726855355.69237: Calling groups_inventory to load vars for managed_node3 30582 1726855355.69239: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.69258: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.69261: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.69265: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.70713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.71604: done with get_vars() 30582 1726855355.71622: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855355.71680: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:35 -0400 (0:00:00.082) 0:01:32.066 ****** 30582 1726855355.71707: entering _queue_task() for managed_node3/yum 30582 1726855355.71970: worker is 1 (out of 1 available) 30582 1726855355.71985: exiting _queue_task() for managed_node3/yum 30582 1726855355.71998: done queuing things up, now waiting for results queue to drain 30582 1726855355.72000: waiting for pending results... 30582 1726855355.72247: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855355.72282: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b41 30582 1726855355.72296: variable 'ansible_search_path' from source: unknown 30582 1726855355.72299: variable 'ansible_search_path' from source: unknown 30582 1726855355.72327: calling self._execute() 30582 1726855355.72402: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.72405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.72414: variable 'omit' from source: magic vars 30582 1726855355.72698: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.72707: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.72830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855355.74419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855355.74461: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855355.74491: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855355.74521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855355.74540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855355.74601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.74633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.74655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.74682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.74696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.74771: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.74785: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855355.74789: when evaluation is False, skipping this task 30582 1726855355.74792: _execute() done 30582 1726855355.74794: dumping result to json 30582 1726855355.74797: done dumping result, returning 30582 1726855355.74805: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001b41] 30582 1726855355.74810: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b41 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855355.74963: no more pending results, returning what we have 30582 1726855355.74969: results queue empty 30582 1726855355.74970: checking for any_errors_fatal 30582 1726855355.74976: done checking for any_errors_fatal 30582 1726855355.74977: checking for max_fail_percentage 30582 1726855355.74979: done checking for max_fail_percentage 30582 1726855355.74980: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.74980: done checking to see if all hosts have failed 30582 1726855355.74981: getting the remaining hosts for this loop 30582 1726855355.74982: done getting the remaining hosts for this loop 30582 1726855355.74986: getting the next task for host managed_node3 30582 1726855355.75000: done getting next task for host managed_node3 30582 1726855355.75005: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855355.75009: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.75040: getting variables 30582 1726855355.75041: in VariableManager get_vars() 30582 1726855355.75085: Calling all_inventory to load vars for managed_node3 30582 1726855355.75097: Calling groups_inventory to load vars for managed_node3 30582 1726855355.75100: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.75106: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b41 30582 1726855355.75108: WORKER PROCESS EXITING 30582 1726855355.75118: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.75121: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.75123: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.75978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.77017: done with get_vars() 30582 1726855355.77035: done getting variables 30582 1726855355.77083: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:35 -0400 (0:00:00.054) 0:01:32.121 ****** 30582 1726855355.77112: entering _queue_task() for managed_node3/fail 30582 1726855355.77381: worker is 1 (out of 1 available) 30582 1726855355.77398: exiting _queue_task() for managed_node3/fail 30582 1726855355.77410: done queuing things up, now waiting for results queue to drain 30582 1726855355.77411: waiting for pending results... 30582 1726855355.77610: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855355.77704: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b42 30582 1726855355.77716: variable 'ansible_search_path' from source: unknown 30582 1726855355.77719: variable 'ansible_search_path' from source: unknown 30582 1726855355.77750: calling self._execute() 30582 1726855355.77831: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.77835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.77843: variable 'omit' from source: magic vars 30582 1726855355.78137: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.78145: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.78237: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855355.78375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855355.79941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855355.79989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855355.80017: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855355.80044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855355.80066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855355.80129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.80164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.80184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.80212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.80223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.80260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.80279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.80298: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.80322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.80332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.80361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.80382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.80400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.80424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.80434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.80581: variable 'network_connections' from source: include params 30582 1726855355.80597: variable 'interface' from source: play vars 30582 1726855355.80646: variable 'interface' from source: play vars 30582 1726855355.80703: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855355.80812: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855355.80838: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855355.80860: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855355.80883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855355.80920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855355.80933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855355.80951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.80969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855355.81009: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855355.81170: variable 'network_connections' from source: include params 30582 1726855355.81173: variable 'interface' from source: play vars 30582 1726855355.81216: variable 'interface' from source: play vars 30582 1726855355.81238: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855355.81242: when evaluation is False, skipping this task 30582 
1726855355.81244: _execute() done 30582 1726855355.81247: dumping result to json 30582 1726855355.81250: done dumping result, returning 30582 1726855355.81252: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001b42] 30582 1726855355.81261: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b42 30582 1726855355.81350: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b42 30582 1726855355.81353: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855355.81416: no more pending results, returning what we have 30582 1726855355.81420: results queue empty 30582 1726855355.81421: checking for any_errors_fatal 30582 1726855355.81430: done checking for any_errors_fatal 30582 1726855355.81431: checking for max_fail_percentage 30582 1726855355.81433: done checking for max_fail_percentage 30582 1726855355.81434: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.81434: done checking to see if all hosts have failed 30582 1726855355.81435: getting the remaining hosts for this loop 30582 1726855355.81437: done getting the remaining hosts for this loop 30582 1726855355.81441: getting the next task for host managed_node3 30582 1726855355.81449: done getting next task for host managed_node3 30582 1726855355.81453: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855355.81458: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855355.81498: getting variables 30582 1726855355.81500: in VariableManager get_vars() 30582 1726855355.81541: Calling all_inventory to load vars for managed_node3 30582 1726855355.81544: Calling groups_inventory to load vars for managed_node3 30582 1726855355.81546: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.81555: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.81558: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.81560: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.82823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.83824: done with get_vars() 30582 1726855355.83853: done getting variables 30582 1726855355.83902: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:35 -0400 (0:00:00.068) 0:01:32.189 ****** 30582 1726855355.83929: entering _queue_task() for managed_node3/package 30582 1726855355.84222: worker is 1 (out of 1 available) 30582 1726855355.84234: exiting _queue_task() for managed_node3/package 30582 1726855355.84248: done queuing things up, now waiting for results queue to drain 30582 1726855355.84249: waiting for pending results... 30582 1726855355.84466: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855355.84588: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b43 30582 1726855355.84601: variable 'ansible_search_path' from source: unknown 30582 1726855355.84605: variable 'ansible_search_path' from source: unknown 30582 1726855355.84647: calling self._execute() 30582 1726855355.84735: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855355.84738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855355.84747: variable 'omit' from source: magic vars 30582 1726855355.85200: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.85203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855355.85338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855355.85769: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855355.85838: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855355.85889: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855355.85984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855355.86116: variable 'network_packages' from source: role '' defaults 30582 1726855355.86238: variable '__network_provider_setup' from source: role '' defaults 30582 1726855355.86255: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855355.86330: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855355.86343: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855355.86506: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855355.86615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855355.88795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855355.88870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855355.88923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855355.88959: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855355.88993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855355.89094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.89193: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.89197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.89217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.89393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.89396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.89399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.89401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.89402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.89404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855355.89661: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855355.89794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.89825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.89867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.89912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.89931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.90034: variable 'ansible_python' from source: facts 30582 1726855355.90058: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855355.90154: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855355.90257: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855355.90379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.90423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.90441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.90507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.90510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.90800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855355.90812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855355.90815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.90848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855355.90870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855355.91214: variable 'network_connections' from source: include params 
30582 1726855355.91234: variable 'interface' from source: play vars 30582 1726855355.91458: variable 'interface' from source: play vars 30582 1726855355.92519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855355.92683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855355.92719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855355.92895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855355.92898: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855355.93636: variable 'network_connections' from source: include params 30582 1726855355.93723: variable 'interface' from source: play vars 30582 1726855355.93945: variable 'interface' from source: play vars 30582 1726855355.94050: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855355.94216: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855355.94785: variable 'network_connections' from source: include params 30582 1726855355.94801: variable 'interface' from source: play vars 30582 1726855355.94858: variable 'interface' from source: play vars 30582 1726855355.94886: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855355.94971: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855355.95280: variable 'network_connections' 
from source: include params 30582 1726855355.95283: variable 'interface' from source: play vars 30582 1726855355.95352: variable 'interface' from source: play vars 30582 1726855355.95422: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855355.95511: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855355.95517: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855355.95569: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855355.95739: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855355.96048: variable 'network_connections' from source: include params 30582 1726855355.96052: variable 'interface' from source: play vars 30582 1726855355.96095: variable 'interface' from source: play vars 30582 1726855355.96102: variable 'ansible_distribution' from source: facts 30582 1726855355.96105: variable '__network_rh_distros' from source: role '' defaults 30582 1726855355.96111: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.96121: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855355.96228: variable 'ansible_distribution' from source: facts 30582 1726855355.96231: variable '__network_rh_distros' from source: role '' defaults 30582 1726855355.96235: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.96247: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855355.96355: variable 'ansible_distribution' from source: facts 30582 1726855355.96359: variable '__network_rh_distros' from source: role '' defaults 30582 1726855355.96363: variable 'ansible_distribution_major_version' from source: facts 30582 1726855355.96394: variable 'network_provider' from source: set_fact 30582 
1726855355.96405: variable 'ansible_facts' from source: unknown 30582 1726855355.96795: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855355.96798: when evaluation is False, skipping this task 30582 1726855355.96801: _execute() done 30582 1726855355.96803: dumping result to json 30582 1726855355.96805: done dumping result, returning 30582 1726855355.96818: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000001b43] 30582 1726855355.96821: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b43 30582 1726855355.96915: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b43 30582 1726855355.96918: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855355.96974: no more pending results, returning what we have 30582 1726855355.96978: results queue empty 30582 1726855355.96979: checking for any_errors_fatal 30582 1726855355.96986: done checking for any_errors_fatal 30582 1726855355.96989: checking for max_fail_percentage 30582 1726855355.96991: done checking for max_fail_percentage 30582 1726855355.96992: checking to see if all hosts have failed and the running result is not ok 30582 1726855355.96992: done checking to see if all hosts have failed 30582 1726855355.96993: getting the remaining hosts for this loop 30582 1726855355.96995: done getting the remaining hosts for this loop 30582 1726855355.96998: getting the next task for host managed_node3 30582 1726855355.97006: done getting next task for host managed_node3 30582 1726855355.97010: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855355.97015: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855355.97043: getting variables 30582 1726855355.97044: in VariableManager get_vars() 30582 1726855355.97103: Calling all_inventory to load vars for managed_node3 30582 1726855355.97106: Calling groups_inventory to load vars for managed_node3 30582 1726855355.97109: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855355.97118: Calling all_plugins_play to load vars for managed_node3 30582 1726855355.97121: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855355.97123: Calling groups_plugins_play to load vars for managed_node3 30582 1726855355.98794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855355.99683: done with get_vars() 30582 1726855355.99709: done getting variables 30582 1726855355.99758: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:35 -0400 (0:00:00.158) 0:01:32.347 ****** 30582 1726855355.99790: entering _queue_task() for managed_node3/package 30582 1726855356.00078: worker is 1 (out of 1 available) 30582 1726855356.00095: exiting _queue_task() for managed_node3/package 30582 1726855356.00108: done queuing things up, now waiting for results queue to drain 30582 1726855356.00110: waiting for pending results... 
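Editor's note: the "Install packages" skip recorded above carries `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"`, meaning the role installs only when at least one requested package is absent from the gathered package facts. A hypothetical sketch of such a task, reconstructed from that condition and the `package` action module the log loads (not the role's verbatim source):

```yaml
# Hypothetical reconstruction based on the false_condition in the log above.
# Assumes ansible.builtin.package_facts has populated ansible_facts.packages
# earlier in the play.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

When every name in `network_packages` is already a key of `ansible_facts.packages`, the `subset` test succeeds, the negated condition is false, and the task skips exactly as the log shows.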
30582 1726855356.00296: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855356.00418: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b44 30582 1726855356.00431: variable 'ansible_search_path' from source: unknown 30582 1726855356.00434: variable 'ansible_search_path' from source: unknown 30582 1726855356.00507: calling self._execute() 30582 1726855356.00602: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.00606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.00608: variable 'omit' from source: magic vars 30582 1726855356.01061: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.01065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.01169: variable 'network_state' from source: role '' defaults 30582 1726855356.01173: Evaluated conditional (network_state != {}): False 30582 1726855356.01176: when evaluation is False, skipping this task 30582 1726855356.01248: _execute() done 30582 1726855356.01252: dumping result to json 30582 1726855356.01255: done dumping result, returning 30582 1726855356.01258: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001b44] 30582 1726855356.01261: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b44 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855356.01402: no more pending results, returning what we have 30582 1726855356.01406: results queue empty 30582 1726855356.01408: checking for any_errors_fatal 30582 1726855356.01416: done checking for any_errors_fatal 30582 1726855356.01416: checking for max_fail_percentage 30582 
1726855356.01418: done checking for max_fail_percentage 30582 1726855356.01419: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.01420: done checking to see if all hosts have failed 30582 1726855356.01421: getting the remaining hosts for this loop 30582 1726855356.01423: done getting the remaining hosts for this loop 30582 1726855356.01426: getting the next task for host managed_node3 30582 1726855356.01437: done getting next task for host managed_node3 30582 1726855356.01441: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855356.01447: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855356.01484: getting variables 30582 1726855356.01486: in VariableManager get_vars() 30582 1726855356.01534: Calling all_inventory to load vars for managed_node3 30582 1726855356.01537: Calling groups_inventory to load vars for managed_node3 30582 1726855356.01540: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.01553: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.01557: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.01560: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.02205: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b44 30582 1726855356.02209: WORKER PROCESS EXITING 30582 1726855356.02762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855356.03773: done with get_vars() 30582 1726855356.03805: done getting variables 30582 1726855356.03865: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:36 -0400 (0:00:00.041) 0:01:32.388 ****** 30582 1726855356.03908: entering _queue_task() for managed_node3/package 30582 1726855356.04305: worker is 1 (out of 1 available) 30582 1726855356.04320: exiting _queue_task() for managed_node3/package 30582 1726855356.04332: done queuing things up, now waiting for results queue to drain 30582 1726855356.04333: waiting for pending results... 
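Editor's note: both nmstate-related install tasks in this stretch of the log are gated identically, with `"false_condition": "network_state != {}"`; they run only when the caller supplies a non-empty `network_state` (the log attributes it to `role '' defaults`, i.e. an empty dict by default). A hedged sketch of this gating pattern, combining the two conditionals the log evaluates (hypothetical, not the role's verbatim source):

```yaml
# Hypothetical sketch: packages needed only for the network_state code path.
# network_state comes from the role defaults as {}, so with no user-supplied
# network_state the second condition is false and the task skips, as logged.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```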
30582 1726855356.04807: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855356.04818: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b45 30582 1726855356.04821: variable 'ansible_search_path' from source: unknown 30582 1726855356.04825: variable 'ansible_search_path' from source: unknown 30582 1726855356.04843: calling self._execute() 30582 1726855356.04947: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.04959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.04977: variable 'omit' from source: magic vars 30582 1726855356.05363: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.05384: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.05514: variable 'network_state' from source: role '' defaults 30582 1726855356.05530: Evaluated conditional (network_state != {}): False 30582 1726855356.05539: when evaluation is False, skipping this task 30582 1726855356.05546: _execute() done 30582 1726855356.05554: dumping result to json 30582 1726855356.05561: done dumping result, returning 30582 1726855356.05578: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001b45] 30582 1726855356.05592: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b45 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855356.05756: no more pending results, returning what we have 30582 1726855356.05760: results queue empty 30582 1726855356.05761: checking for any_errors_fatal 30582 1726855356.05767: done checking for any_errors_fatal 30582 1726855356.05768: checking for max_fail_percentage 30582 
1726855356.05770: done checking for max_fail_percentage 30582 1726855356.05771: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.05772: done checking to see if all hosts have failed 30582 1726855356.05773: getting the remaining hosts for this loop 30582 1726855356.05774: done getting the remaining hosts for this loop 30582 1726855356.05778: getting the next task for host managed_node3 30582 1726855356.05786: done getting next task for host managed_node3 30582 1726855356.05792: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855356.05798: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855356.05836: getting variables 30582 1726855356.05838: in VariableManager get_vars() 30582 1726855356.05883: Calling all_inventory to load vars for managed_node3 30582 1726855356.05886: Calling groups_inventory to load vars for managed_node3 30582 1726855356.05939: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.05952: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.05955: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.05959: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.06529: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b45 30582 1726855356.06533: WORKER PROCESS EXITING 30582 1726855356.07028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855356.08091: done with get_vars() 30582 1726855356.08127: done getting variables 30582 1726855356.08197: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:36 -0400 (0:00:00.043) 0:01:32.432 ****** 30582 1726855356.08235: entering _queue_task() for managed_node3/service 30582 1726855356.08779: worker is 1 (out of 1 available) 30582 1726855356.08795: exiting _queue_task() for managed_node3/service 30582 1726855356.08807: done queuing things up, now waiting for results queue to drain 30582 1726855356.08809: waiting for pending results... 
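Editor's note: the task queued here, "Restart NetworkManager due to wireless or team interfaces", is checked against `__network_wireless_connections_defined or __network_team_connections_defined`, which the role derives from the entries in `network_connections`. A hypothetical sketch of the restart task being exercised, inferred from the task name, the `service` action module the log loads, and the false_condition it reports (not the role's verbatim source):

```yaml
# Hypothetical sketch: NetworkManager is restarted only when the requested
# connections include wireless or team interfaces; with neither defined,
# the condition evaluates false and the task skips, as the log records.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```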
30582 1726855356.08915: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855356.09022: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b46 30582 1726855356.09035: variable 'ansible_search_path' from source: unknown 30582 1726855356.09038: variable 'ansible_search_path' from source: unknown 30582 1726855356.09090: calling self._execute() 30582 1726855356.09151: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.09155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.09168: variable 'omit' from source: magic vars 30582 1726855356.09467: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.09475: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.09569: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855356.09709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855356.11326: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855356.11378: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855356.11409: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855356.11435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855356.11461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855356.11525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855356.11559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.11582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.11611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.11622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.11658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.11677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.11699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.11723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.11733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.11761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.11782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.11803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.11827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.11837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.11963: variable 'network_connections' from source: include params 30582 1726855356.11975: variable 'interface' from source: play vars 30582 1726855356.12032: variable 'interface' from source: play vars 30582 1726855356.12085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855356.12198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855356.12229: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855356.12251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855356.12277: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855356.12310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855356.12328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855356.12349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.12366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855356.12409: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855356.12576: variable 'network_connections' from source: include params 30582 1726855356.12580: variable 'interface' from source: play vars 30582 1726855356.12626: variable 'interface' from source: play vars 30582 1726855356.12648: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855356.12651: when evaluation is False, skipping this task 30582 1726855356.12654: _execute() done 30582 1726855356.12658: dumping result to json 30582 1726855356.12660: done dumping result, returning 30582 1726855356.12671: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001b46] 30582 1726855356.12676: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b46 30582 1726855356.12766: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000001b46 30582 1726855356.12778: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855356.12826: no more pending results, returning what we have 30582 1726855356.12830: results queue empty 30582 1726855356.12831: checking for any_errors_fatal 30582 1726855356.12838: done checking for any_errors_fatal 30582 1726855356.12839: checking for max_fail_percentage 30582 1726855356.12841: done checking for max_fail_percentage 30582 1726855356.12842: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.12843: done checking to see if all hosts have failed 30582 1726855356.12843: getting the remaining hosts for this loop 30582 1726855356.12845: done getting the remaining hosts for this loop 30582 1726855356.12848: getting the next task for host managed_node3 30582 1726855356.12856: done getting next task for host managed_node3 30582 1726855356.12859: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855356.12865: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855356.12895: getting variables 30582 1726855356.12897: in VariableManager get_vars() 30582 1726855356.12939: Calling all_inventory to load vars for managed_node3 30582 1726855356.12942: Calling groups_inventory to load vars for managed_node3 30582 1726855356.12944: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.12954: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.12956: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.12959: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.13972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855356.15228: done with get_vars() 30582 1726855356.15255: done getting variables 30582 1726855356.15305: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:36 -0400 (0:00:00.070) 0:01:32.503 ****** 30582 1726855356.15333: entering _queue_task() for managed_node3/service 30582 1726855356.15616: worker is 1 (out of 1 available) 30582 1726855356.15630: exiting _queue_task() for managed_node3/service 30582 1726855356.15643: done 
queuing things up, now waiting for results queue to drain 30582 1726855356.15645: waiting for pending results... 30582 1726855356.15845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855356.15942: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b47 30582 1726855356.15954: variable 'ansible_search_path' from source: unknown 30582 1726855356.15958: variable 'ansible_search_path' from source: unknown 30582 1726855356.15993: calling self._execute() 30582 1726855356.16064: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.16070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.16079: variable 'omit' from source: magic vars 30582 1726855356.16374: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.16383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.16503: variable 'network_provider' from source: set_fact 30582 1726855356.16507: variable 'network_state' from source: role '' defaults 30582 1726855356.16518: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855356.16528: variable 'omit' from source: magic vars 30582 1726855356.16571: variable 'omit' from source: magic vars 30582 1726855356.16591: variable 'network_service_name' from source: role '' defaults 30582 1726855356.16643: variable 'network_service_name' from source: role '' defaults 30582 1726855356.16717: variable '__network_provider_setup' from source: role '' defaults 30582 1726855356.16720: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855356.16771: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855356.16778: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855356.16824: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855356.16983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855356.19422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855356.19426: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855356.19442: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855356.19482: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855356.19511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855356.19596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.19641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.19667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.19712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.19727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.19775: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.19801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.19855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.19863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.19881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.20182: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855356.20241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.20263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.20292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.20332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.20345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.20440: variable 'ansible_python' from source: facts 30582 1726855356.20456: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855356.20541: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855356.20623: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855356.20741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.20763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.20791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.20834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.20841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.20897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.20913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.20941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.21056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.21060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.21175: variable 'network_connections' from source: include params 30582 1726855356.21179: variable 'interface' from source: play vars 30582 1726855356.21208: variable 'interface' from source: play vars 30582 1726855356.21320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855356.21449: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855356.21490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855356.21524: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855356.21554: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855356.21610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855356.21628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855356.21652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.21683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855356.21728: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855356.21917: variable 'network_connections' from source: include params 30582 1726855356.21921: variable 'interface' from source: play vars 30582 1726855356.21976: variable 'interface' from source: play vars 30582 1726855356.22001: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855356.22057: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855356.22243: variable 'network_connections' from source: include params 30582 1726855356.22246: variable 'interface' from source: play vars 30582 1726855356.22301: variable 'interface' from source: play vars 30582 1726855356.22319: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855356.22375: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855356.22559: variable 'network_connections' from source: include params 30582 1726855356.22563: variable 'interface' from source: play vars 30582 1726855356.22617: variable 'interface' from source: play vars 30582 1726855356.22654: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855356.22700: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855356.22705: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855356.22747: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855356.22881: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855356.23194: variable 'network_connections' from source: include params 30582 1726855356.23197: variable 'interface' from source: play vars 30582 1726855356.23242: variable 'interface' from source: play vars 30582 1726855356.23245: variable 'ansible_distribution' from source: facts 30582 1726855356.23250: variable '__network_rh_distros' from source: role '' defaults 30582 1726855356.23255: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.23269: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855356.23395: variable 'ansible_distribution' from source: facts 30582 1726855356.23398: variable '__network_rh_distros' from source: role '' defaults 30582 1726855356.23403: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.23414: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855356.23671: variable 'ansible_distribution' from source: facts 30582 1726855356.23674: variable '__network_rh_distros' from source: role '' defaults 30582 1726855356.23676: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.23678: variable 'network_provider' from source: set_fact 30582 1726855356.23680: variable 'omit' from source: magic vars 30582 1726855356.23682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855356.23719: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855356.23737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855356.23754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855356.23769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855356.23810: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855356.23814: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.23816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.24012: Set connection var ansible_timeout to 10 30582 1726855356.24016: Set connection var ansible_connection to ssh 30582 1726855356.24018: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855356.24020: Set connection var ansible_pipelining to False 30582 1726855356.24022: Set connection var ansible_shell_executable to /bin/sh 30582 1726855356.24024: Set connection var ansible_shell_type to sh 30582 1726855356.24026: variable 'ansible_shell_executable' from source: unknown 30582 1726855356.24028: variable 'ansible_connection' from source: unknown 30582 1726855356.24030: variable 'ansible_module_compression' from source: unknown 30582 1726855356.24032: variable 'ansible_shell_type' from source: unknown 30582 1726855356.24033: variable 'ansible_shell_executable' from source: unknown 30582 1726855356.24035: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.24037: variable 'ansible_pipelining' from source: unknown 30582 1726855356.24039: variable 'ansible_timeout' from source: unknown 30582 1726855356.24041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855356.24120: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855356.24129: variable 'omit' from source: magic vars 30582 1726855356.24131: starting attempt loop 30582 1726855356.24134: running the handler 30582 1726855356.24228: variable 'ansible_facts' from source: unknown 30582 1726855356.24814: _low_level_execute_command(): starting 30582 1726855356.24818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855356.25327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855356.25331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.25334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855356.25336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.25379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855356.25383: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855356.25399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.25479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.27170: stdout chunk (state=3): >>>/root <<< 30582 1726855356.27292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855356.27321: stdout chunk (state=3): >>><<< 30582 1726855356.27324: stderr chunk (state=3): >>><<< 30582 1726855356.27327: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855356.27338: _low_level_execute_command(): starting 30582 1726855356.27344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393 `" && echo ansible-tmp-1726855356.273273-34899-176295820420393="` echo /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393 `" ) && sleep 0' 30582 1726855356.28026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855356.28031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.28109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.30029: stdout chunk (state=3): >>>ansible-tmp-1726855356.273273-34899-176295820420393=/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393 <<< 30582 1726855356.30174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855356.30196: stderr chunk (state=3): >>><<< 30582 1726855356.30211: stdout chunk (state=3): >>><<< 30582 1726855356.30279: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855356.273273-34899-176295820420393=/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855356.30283: variable 'ansible_module_compression' from source: unknown 30582 1726855356.30377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855356.30455: variable 'ansible_facts' from source: unknown 30582 1726855356.30885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py 30582 1726855356.31293: Sending initial data 30582 1726855356.31296: Sent initial data (155 bytes) 30582 1726855356.32500: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855356.32509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.32659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855356.32714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855356.32733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.32882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.34512: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855356.34575: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855356.34621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpf4hyserj /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py <<< 30582 1726855356.34643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py" <<< 30582 1726855356.34747: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 30582 1726855356.34779: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpf4hyserj" to remote "/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py" <<< 30582 1726855356.36491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855356.36594: stderr chunk (state=3): >>><<< 30582 1726855356.36597: stdout chunk (state=3): >>><<< 30582 1726855356.36600: done transferring module to remote 30582 1726855356.36602: _low_level_execute_command(): starting 30582 1726855356.36605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/ /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py && sleep 0' 30582 1726855356.37014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855356.37018: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855356.37045: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.37048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855356.37051: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855356.37053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.37110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855356.37114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.37181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.39031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855356.39062: stderr chunk (state=3): >>><<< 30582 1726855356.39065: stdout chunk (state=3): >>><<< 30582 1726855356.39091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855356.39094: _low_level_execute_command(): starting 30582 1726855356.39095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/AnsiballZ_systemd.py && sleep 0' 30582 1726855356.39517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855356.39538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855356.39541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.39605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855356.39608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.39678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.69079: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317370880", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2224214000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", 
"MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855356.69111: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", 
"StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", 
"ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", 
"OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855356.71400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855356.71405: stdout chunk (state=3): >>><<< 30582 1726855356.71407: stderr chunk (state=3): >>><<< 30582 1726855356.71413: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317370880", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2224214000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855356.71534: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855356.71564: _low_level_execute_command(): starting 30582 1726855356.71602: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855356.273273-34899-176295820420393/ > /dev/null 2>&1 && sleep 0' 30582 1726855356.72862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855356.72874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855356.72884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855356.72966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855356.72982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855356.72991: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855356.73001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.73015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855356.73022: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855356.73072: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855356.73112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855356.73188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855356.73303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855356.73395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855356.75246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855356.75345: stderr chunk (state=3): >>><<< 30582 1726855356.75349: stdout chunk (state=3): >>><<< 30582 1726855356.75364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855356.75440: handler run complete 30582 1726855356.75443: attempt loop complete, returning result 30582 1726855356.75445: _execute() done 30582 1726855356.75447: dumping result to json 30582 1726855356.75461: done dumping result, returning 30582 1726855356.75475: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000001b47] 30582 1726855356.75478: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b47 30582 1726855356.76292: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b47 30582 1726855356.76296: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855356.76362: no more pending results, returning what we have 30582 1726855356.76366: results queue empty 30582 1726855356.76368: checking for any_errors_fatal 30582 1726855356.76374: done checking for any_errors_fatal 30582 1726855356.76375: checking for max_fail_percentage 30582 1726855356.76377: done checking for max_fail_percentage 30582 1726855356.76378: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.76379: done checking to see if all hosts have failed 30582 1726855356.76379: getting the remaining hosts for this loop 30582 1726855356.76381: done getting the remaining hosts for this loop 30582 1726855356.76384: getting the next task for host managed_node3 30582 1726855356.76394: done 
getting next task for host managed_node3 30582 1726855356.76398: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855356.76402: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855356.76421: getting variables 30582 1726855356.76423: in VariableManager get_vars() 30582 1726855356.76980: Calling all_inventory to load vars for managed_node3 30582 1726855356.76983: Calling groups_inventory to load vars for managed_node3 30582 1726855356.76987: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.77103: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.77113: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.77117: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.79620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855356.81405: done with get_vars() 30582 1726855356.81441: done getting variables 30582 1726855356.81718: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:36 -0400 (0:00:00.664) 0:01:33.167 ****** 30582 1726855356.81753: entering _queue_task() for managed_node3/service 30582 1726855356.82213: worker is 1 (out of 1 available) 30582 1726855356.82235: exiting _queue_task() for managed_node3/service 30582 1726855356.82249: done queuing things up, now waiting for results queue to drain 30582 1726855356.82250: waiting for pending results... 
30582 1726855356.82505: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855356.82675: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b48 30582 1726855356.82792: variable 'ansible_search_path' from source: unknown 30582 1726855356.82797: variable 'ansible_search_path' from source: unknown 30582 1726855356.82801: calling self._execute() 30582 1726855356.82916: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.82919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.82922: variable 'omit' from source: magic vars 30582 1726855356.83245: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.83267: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.83394: variable 'network_provider' from source: set_fact 30582 1726855356.83406: Evaluated conditional (network_provider == "nm"): True 30582 1726855356.83510: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855356.83615: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855356.83814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855356.86046: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855356.86173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855356.86180: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855356.86224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855356.86258: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855356.86372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.86415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.86494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.86501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.86523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.86578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.86613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.86643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.86693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.86719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.86818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855356.86821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855356.86826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.86874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855356.86896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855356.87059: variable 'network_connections' from source: include params 30582 1726855356.87080: variable 'interface' from source: play vars 30582 1726855356.87160: variable 'interface' from source: play vars 30582 1726855356.87248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855356.87438: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855356.87577: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855356.87580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855356.87582: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855356.87605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855356.87629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855356.87656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855356.87695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855356.87748: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855356.88035: variable 'network_connections' from source: include params 30582 1726855356.88046: variable 'interface' from source: play vars 30582 1726855356.88134: variable 'interface' from source: play vars 30582 1726855356.88176: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855356.88186: when evaluation is False, skipping this task 30582 1726855356.88197: _execute() done 30582 1726855356.88205: dumping result to json 30582 1726855356.88212: done dumping result, returning 30582 1726855356.88232: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000001b48] 30582 
1726855356.88342: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b48 30582 1726855356.88419: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b48 30582 1726855356.88423: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855356.88500: no more pending results, returning what we have 30582 1726855356.88504: results queue empty 30582 1726855356.88505: checking for any_errors_fatal 30582 1726855356.88522: done checking for any_errors_fatal 30582 1726855356.88523: checking for max_fail_percentage 30582 1726855356.88525: done checking for max_fail_percentage 30582 1726855356.88526: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.88527: done checking to see if all hosts have failed 30582 1726855356.88527: getting the remaining hosts for this loop 30582 1726855356.88529: done getting the remaining hosts for this loop 30582 1726855356.88533: getting the next task for host managed_node3 30582 1726855356.88541: done getting next task for host managed_node3 30582 1726855356.88545: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855356.88551: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855356.88583: getting variables 30582 1726855356.88586: in VariableManager get_vars() 30582 1726855356.88633: Calling all_inventory to load vars for managed_node3 30582 1726855356.88636: Calling groups_inventory to load vars for managed_node3 30582 1726855356.88639: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.88650: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.88654: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.88657: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.90896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855356.93258: done with get_vars() 30582 1726855356.93289: done getting variables 30582 1726855356.93340: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:36 -0400 (0:00:00.116) 0:01:33.283 
****** 30582 1726855356.93367: entering _queue_task() for managed_node3/service 30582 1726855356.93645: worker is 1 (out of 1 available) 30582 1726855356.93660: exiting _queue_task() for managed_node3/service 30582 1726855356.93674: done queuing things up, now waiting for results queue to drain 30582 1726855356.93676: waiting for pending results... 30582 1726855356.93896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855356.94037: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b49 30582 1726855356.94050: variable 'ansible_search_path' from source: unknown 30582 1726855356.94053: variable 'ansible_search_path' from source: unknown 30582 1726855356.94091: calling self._execute() 30582 1726855356.94292: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855356.94296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855356.94299: variable 'omit' from source: magic vars 30582 1726855356.94589: variable 'ansible_distribution_major_version' from source: facts 30582 1726855356.94607: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855356.94728: variable 'network_provider' from source: set_fact 30582 1726855356.94740: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855356.94748: when evaluation is False, skipping this task 30582 1726855356.94755: _execute() done 30582 1726855356.94762: dumping result to json 30582 1726855356.94768: done dumping result, returning 30582 1726855356.94778: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000001b49] 30582 1726855356.94791: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b49 30582 1726855356.95227: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b49 30582 1726855356.95231: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855356.95284: no more pending results, returning what we have 30582 1726855356.95294: results queue empty 30582 1726855356.95296: checking for any_errors_fatal 30582 1726855356.95304: done checking for any_errors_fatal 30582 1726855356.95305: checking for max_fail_percentage 30582 1726855356.95307: done checking for max_fail_percentage 30582 1726855356.95308: checking to see if all hosts have failed and the running result is not ok 30582 1726855356.95309: done checking to see if all hosts have failed 30582 1726855356.95310: getting the remaining hosts for this loop 30582 1726855356.95312: done getting the remaining hosts for this loop 30582 1726855356.95316: getting the next task for host managed_node3 30582 1726855356.95326: done getting next task for host managed_node3 30582 1726855356.95331: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855356.95337: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855356.95378: getting variables 30582 1726855356.95381: in VariableManager get_vars() 30582 1726855356.95838: Calling all_inventory to load vars for managed_node3 30582 1726855356.95841: Calling groups_inventory to load vars for managed_node3 30582 1726855356.95843: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855356.95852: Calling all_plugins_play to load vars for managed_node3 30582 1726855356.95855: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855356.95857: Calling groups_plugins_play to load vars for managed_node3 30582 1726855356.99160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.02382: done with get_vars() 30582 1726855357.02420: done getting variables 30582 1726855357.02480: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:37 -0400 (0:00:00.092) 0:01:33.376 ****** 30582 1726855357.02626: entering _queue_task() for managed_node3/copy 30582 1726855357.03098: worker is 1 (out of 1 available) 30582 1726855357.03116: exiting _queue_task() for managed_node3/copy 30582 1726855357.03129: done queuing things up, now waiting for results queue to drain 30582 1726855357.03130: waiting for 
pending results... 30582 1726855357.03464: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855357.03793: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4a 30582 1726855357.03797: variable 'ansible_search_path' from source: unknown 30582 1726855357.03800: variable 'ansible_search_path' from source: unknown 30582 1726855357.03803: calling self._execute() 30582 1726855357.03806: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.03809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.03811: variable 'omit' from source: magic vars 30582 1726855357.04185: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.04197: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.04327: variable 'network_provider' from source: set_fact 30582 1726855357.04333: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855357.04336: when evaluation is False, skipping this task 30582 1726855357.04338: _execute() done 30582 1726855357.04341: dumping result to json 30582 1726855357.04343: done dumping result, returning 30582 1726855357.04354: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000001b4a] 30582 1726855357.04358: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4a 30582 1726855357.04459: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4a 30582 1726855357.04463: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855357.04518: no more pending results, returning what we have 30582 1726855357.04523: results queue empty 30582 
1726855357.04524: checking for any_errors_fatal 30582 1726855357.04647: done checking for any_errors_fatal 30582 1726855357.04649: checking for max_fail_percentage 30582 1726855357.04652: done checking for max_fail_percentage 30582 1726855357.04653: checking to see if all hosts have failed and the running result is not ok 30582 1726855357.04654: done checking to see if all hosts have failed 30582 1726855357.04655: getting the remaining hosts for this loop 30582 1726855357.04656: done getting the remaining hosts for this loop 30582 1726855357.04661: getting the next task for host managed_node3 30582 1726855357.04672: done getting next task for host managed_node3 30582 1726855357.04676: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855357.04682: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855357.04717: getting variables 30582 1726855357.04719: in VariableManager get_vars() 30582 1726855357.04767: Calling all_inventory to load vars for managed_node3 30582 1726855357.04770: Calling groups_inventory to load vars for managed_node3 30582 1726855357.04773: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.04786: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.04906: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.04911: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.13289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.14962: done with get_vars() 30582 1726855357.15010: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:37 -0400 (0:00:00.124) 0:01:33.500 ****** 30582 1726855357.15097: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855357.15584: worker is 1 (out of 1 available) 30582 1726855357.15601: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855357.15612: done queuing things up, now waiting for results queue to drain 30582 1726855357.15614: waiting for pending results... 
30582 1726855357.15903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855357.16094: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4b 30582 1726855357.16101: variable 'ansible_search_path' from source: unknown 30582 1726855357.16105: variable 'ansible_search_path' from source: unknown 30582 1726855357.16127: calling self._execute() 30582 1726855357.16295: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.16301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.16304: variable 'omit' from source: magic vars 30582 1726855357.16649: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.16667: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.16676: variable 'omit' from source: magic vars 30582 1726855357.16735: variable 'omit' from source: magic vars 30582 1726855357.16911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855357.19223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855357.19310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855357.19493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855357.19497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855357.19500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855357.19503: variable 'network_provider' from source: set_fact 30582 1726855357.19637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855357.19664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855357.19699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855357.19738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855357.19750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855357.19833: variable 'omit' from source: magic vars 30582 1726855357.19951: variable 'omit' from source: magic vars 30582 1726855357.20061: variable 'network_connections' from source: include params 30582 1726855357.20076: variable 'interface' from source: play vars 30582 1726855357.20144: variable 'interface' from source: play vars 30582 1726855357.20307: variable 'omit' from source: magic vars 30582 1726855357.20316: variable '__lsr_ansible_managed' from source: task vars 30582 1726855357.20382: variable '__lsr_ansible_managed' from source: task vars 30582 1726855357.20562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855357.20795: Loaded config def from plugin (lookup/template) 30582 1726855357.20799: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855357.20833: File lookup term: get_ansible_managed.j2 30582 1726855357.20836: variable 
'ansible_search_path' from source: unknown 30582 1726855357.20840: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855357.20855: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855357.20892: variable 'ansible_search_path' from source: unknown 30582 1726855357.27582: variable 'ansible_managed' from source: unknown 30582 1726855357.27994: variable 'omit' from source: magic vars 30582 1726855357.27999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855357.28002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855357.28004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855357.28006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855357.28009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.28011: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855357.28013: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.28015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.28018: Set connection var ansible_timeout to 10 30582 1726855357.28020: Set connection var ansible_connection to ssh 30582 1726855357.28022: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855357.28024: Set connection var ansible_pipelining to False 30582 1726855357.28030: Set connection var ansible_shell_executable to /bin/sh 30582 1726855357.28032: Set connection var ansible_shell_type to sh 30582 1726855357.28056: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.28060: variable 'ansible_connection' from source: unknown 30582 1726855357.28062: variable 'ansible_module_compression' from source: unknown 30582 1726855357.28064: variable 'ansible_shell_type' from source: unknown 30582 1726855357.28069: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.28072: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.28075: variable 'ansible_pipelining' from source: unknown 30582 1726855357.28079: variable 'ansible_timeout' from source: unknown 30582 1726855357.28083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.28228: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855357.28241: variable 'omit' from 
source: magic vars 30582 1726855357.28244: starting attempt loop 30582 1726855357.28246: running the handler 30582 1726855357.28262: _low_level_execute_command(): starting 30582 1726855357.28272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855357.29091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.29121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.29135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.29177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.29241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.30962: stdout chunk (state=3): >>>/root <<< 30582 1726855357.31115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.31118: stdout chunk (state=3): >>><<< 30582 1726855357.31120: stderr chunk (state=3): >>><<< 30582 1726855357.31138: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855357.31157: _low_level_execute_command(): starting 30582 1726855357.31248: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111 `" && echo ansible-tmp-1726855357.311452-34949-63324208503111="` echo /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111 `" ) && sleep 0' 30582 1726855357.31828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855357.31842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855357.31860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855357.31882: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855357.31904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855357.31946: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.32019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.32043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.32071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.32173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.34103: stdout chunk (state=3): >>>ansible-tmp-1726855357.311452-34949-63324208503111=/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111 <<< 30582 1726855357.34194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.34393: stderr chunk (state=3): >>><<< 30582 1726855357.34396: stdout chunk (state=3): >>><<< 30582 1726855357.34400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855357.311452-34949-63324208503111=/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855357.34402: variable 'ansible_module_compression' from source: unknown 30582 1726855357.34404: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855357.34406: variable 'ansible_facts' from source: unknown 30582 1726855357.34532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py 30582 1726855357.34754: Sending initial data 30582 1726855357.34757: Sent initial data (166 bytes) 30582 1726855357.35261: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855357.35299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.35402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.35410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.35478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.37082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855357.37143: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 30582 1726855357.37215: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpcop5zvnz /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py <<< 30582 1726855357.37219: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py" <<< 30582 1726855357.37278: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpcop5zvnz" to remote "/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py" <<< 30582 1726855357.38096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.38200: stderr chunk (state=3): >>><<< 30582 1726855357.38203: stdout chunk (state=3): >>><<< 30582 1726855357.38210: done transferring module to remote 30582 1726855357.38222: _low_level_execute_command(): starting 30582 1726855357.38230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/ /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py && sleep 0' 30582 1726855357.38762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855357.38768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855357.38799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.38803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855357.38805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855357.38807: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.38861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.38869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.38871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.38926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.40728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.40753: stderr chunk (state=3): >>><<< 30582 1726855357.40758: stdout chunk (state=3): >>><<< 30582 1726855357.40775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855357.40778: _low_level_execute_command(): starting 30582 1726855357.40781: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/AnsiballZ_network_connections.py && sleep 0' 30582 1726855357.41226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855357.41229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855357.41231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855357.41233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855357.41235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.41286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.41293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.41296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.41364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.69226: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_c45ma2kz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_c45ma2kz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/07988b43-0bc6-4bfd-8ab8-3bff1d23cced: error=unknown <<< 30582 1726855357.69296: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 
30582 1726855357.71290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855357.71295: stdout chunk (state=3): >>><<< 30582 1726855357.71297: stderr chunk (state=3): >>><<< 30582 1726855357.71299: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_c45ma2kz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_c45ma2kz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/07988b43-0bc6-4bfd-8ab8-3bff1d23cced: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855357.71302: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855357.71304: _low_level_execute_command(): starting 30582 1726855357.71306: _low_level_execute_command(): executing: /bin/sh -c 'rm 
-f -r /root/.ansible/tmp/ansible-tmp-1726855357.311452-34949-63324208503111/ > /dev/null 2>&1 && sleep 0' 30582 1726855357.71940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855357.71957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855357.72002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855357.72015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855357.72106: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.72123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.72145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.72243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.74162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.74196: stdout chunk (state=3): >>><<< 30582 1726855357.74200: stderr chunk (state=3): >>><<< 30582 1726855357.74216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855357.74392: handler run complete 30582 1726855357.74396: attempt loop complete, returning result 30582 1726855357.74398: _execute() done 30582 1726855357.74400: dumping result to json 30582 1726855357.74402: done dumping result, returning 30582 1726855357.74404: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000001b4b] 30582 1726855357.74406: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4b 30582 1726855357.74482: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4b 30582 1726855357.74485: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", 
"persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30582 1726855357.74612: no more pending results, returning what we have 30582 1726855357.74617: results queue empty 30582 1726855357.74618: checking for any_errors_fatal 30582 1726855357.74626: done checking for any_errors_fatal 30582 1726855357.74627: checking for max_fail_percentage 30582 1726855357.74629: done checking for max_fail_percentage 30582 1726855357.74631: checking to see if all hosts have failed and the running result is not ok 30582 1726855357.74631: done checking to see if all hosts have failed 30582 1726855357.74632: getting the remaining hosts for this loop 30582 1726855357.74634: done getting the remaining hosts for this loop 30582 1726855357.74638: getting the next task for host managed_node3 30582 1726855357.74646: done getting next task for host managed_node3 30582 1726855357.74650: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855357.74657: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855357.74676: getting variables 30582 1726855357.74679: in VariableManager get_vars() 30582 1726855357.74938: Calling all_inventory to load vars for managed_node3 30582 1726855357.74941: Calling groups_inventory to load vars for managed_node3 30582 1726855357.74943: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.74954: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.74957: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.74959: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.76671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.78291: done with get_vars() 30582 1726855357.78322: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:37 -0400 (0:00:00.632) 0:01:34.133 ****** 30582 1726855357.78394: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855357.78657: worker is 1 (out of 1 available) 30582 1726855357.78670: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855357.78682: done queuing things up, now waiting for results queue to drain 30582 1726855357.78684: waiting for pending results... 
30582 1726855357.78878: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855357.78991: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4c 30582 1726855357.79003: variable 'ansible_search_path' from source: unknown 30582 1726855357.79006: variable 'ansible_search_path' from source: unknown 30582 1726855357.79037: calling self._execute() 30582 1726855357.79112: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.79116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.79125: variable 'omit' from source: magic vars 30582 1726855357.79415: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.79424: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.79518: variable 'network_state' from source: role '' defaults 30582 1726855357.79526: Evaluated conditional (network_state != {}): False 30582 1726855357.79529: when evaluation is False, skipping this task 30582 1726855357.79532: _execute() done 30582 1726855357.79534: dumping result to json 30582 1726855357.79536: done dumping result, returning 30582 1726855357.79544: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000001b4c] 30582 1726855357.79549: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4c 30582 1726855357.79641: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4c 30582 1726855357.79645: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855357.79718: no more pending results, returning what we have 30582 1726855357.79722: results queue empty 30582 1726855357.79724: checking for any_errors_fatal 30582 1726855357.79735: done checking for any_errors_fatal 
30582 1726855357.79736: checking for max_fail_percentage 30582 1726855357.79737: done checking for max_fail_percentage 30582 1726855357.79739: checking to see if all hosts have failed and the running result is not ok 30582 1726855357.79739: done checking to see if all hosts have failed 30582 1726855357.79740: getting the remaining hosts for this loop 30582 1726855357.79742: done getting the remaining hosts for this loop 30582 1726855357.79745: getting the next task for host managed_node3 30582 1726855357.79753: done getting next task for host managed_node3 30582 1726855357.79757: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855357.79762: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855357.79786: getting variables 30582 1726855357.79789: in VariableManager get_vars() 30582 1726855357.79934: Calling all_inventory to load vars for managed_node3 30582 1726855357.79937: Calling groups_inventory to load vars for managed_node3 30582 1726855357.79940: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.79949: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.79952: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.79956: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.81389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.82781: done with get_vars() 30582 1726855357.82809: done getting variables 30582 1726855357.82854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:37 -0400 (0:00:00.044) 0:01:34.178 ****** 30582 1726855357.82886: entering _queue_task() for managed_node3/debug 30582 1726855357.83156: worker is 1 (out of 1 available) 30582 1726855357.83173: exiting _queue_task() for managed_node3/debug 30582 1726855357.83185: done queuing things up, now waiting for results queue to drain 30582 1726855357.83189: waiting for pending results... 
30582 1726855357.83379: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855357.83496: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4d 30582 1726855357.83507: variable 'ansible_search_path' from source: unknown 30582 1726855357.83510: variable 'ansible_search_path' from source: unknown 30582 1726855357.83541: calling self._execute() 30582 1726855357.83614: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.83617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.83628: variable 'omit' from source: magic vars 30582 1726855357.83913: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.83923: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.83929: variable 'omit' from source: magic vars 30582 1726855357.83973: variable 'omit' from source: magic vars 30582 1726855357.83999: variable 'omit' from source: magic vars 30582 1726855357.84032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855357.84058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855357.84076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855357.84091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.84101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.84143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855357.84146: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.84149: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855357.84224: Set connection var ansible_timeout to 10 30582 1726855357.84227: Set connection var ansible_connection to ssh 30582 1726855357.84232: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855357.84237: Set connection var ansible_pipelining to False 30582 1726855357.84242: Set connection var ansible_shell_executable to /bin/sh 30582 1726855357.84245: Set connection var ansible_shell_type to sh 30582 1726855357.84262: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.84268: variable 'ansible_connection' from source: unknown 30582 1726855357.84271: variable 'ansible_module_compression' from source: unknown 30582 1726855357.84273: variable 'ansible_shell_type' from source: unknown 30582 1726855357.84275: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.84278: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.84280: variable 'ansible_pipelining' from source: unknown 30582 1726855357.84282: variable 'ansible_timeout' from source: unknown 30582 1726855357.84286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.84382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855357.84392: variable 'omit' from source: magic vars 30582 1726855357.84403: starting attempt loop 30582 1726855357.84406: running the handler 30582 1726855357.84536: variable '__network_connections_result' from source: set_fact 30582 1726855357.84672: handler run complete 30582 1726855357.84676: attempt loop complete, returning result 30582 1726855357.84678: _execute() done 30582 1726855357.84680: dumping result to json 30582 1726855357.84683: 
done dumping result, returning 30582 1726855357.84685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001b4d] 30582 1726855357.84689: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4d 30582 1726855357.84759: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4d 30582 1726855357.84762: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 30582 1726855357.84860: no more pending results, returning what we have 30582 1726855357.84867: results queue empty 30582 1726855357.84869: checking for any_errors_fatal 30582 1726855357.84876: done checking for any_errors_fatal 30582 1726855357.84877: checking for max_fail_percentage 30582 1726855357.84880: done checking for max_fail_percentage 30582 1726855357.84881: checking to see if all hosts have failed and the running result is not ok 30582 1726855357.84882: done checking to see if all hosts have failed 30582 1726855357.84882: getting the remaining hosts for this loop 30582 1726855357.84884: done getting the remaining hosts for this loop 30582 1726855357.84906: getting the next task for host managed_node3 30582 1726855357.84915: done getting next task for host managed_node3 30582 1726855357.84919: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855357.84925: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855357.84939: getting variables 30582 1726855357.84941: in VariableManager get_vars() 30582 1726855357.84983: Calling all_inventory to load vars for managed_node3 30582 1726855357.84986: Calling groups_inventory to load vars for managed_node3 30582 1726855357.84990: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.85005: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.85008: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.85010: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.86131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.87142: done with get_vars() 30582 1726855357.87159: done getting variables 30582 1726855357.87208: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:37 -0400 (0:00:00.043) 0:01:34.222 ****** 30582 1726855357.87238: entering _queue_task() for managed_node3/debug 30582 1726855357.87500: worker is 1 (out of 1 available) 30582 1726855357.87513: exiting _queue_task() for managed_node3/debug 30582 1726855357.87524: done queuing things up, now waiting for results queue to drain 30582 1726855357.87526: waiting for pending results... 30582 1726855357.87713: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855357.87808: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4e 30582 1726855357.87820: variable 'ansible_search_path' from source: unknown 30582 1726855357.87823: variable 'ansible_search_path' from source: unknown 30582 1726855357.87852: calling self._execute() 30582 1726855357.87926: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.87931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.87937: variable 'omit' from source: magic vars 30582 1726855357.88227: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.88236: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.88242: variable 'omit' from source: magic vars 30582 1726855357.88286: variable 'omit' from source: magic vars 30582 1726855357.88313: variable 'omit' from source: magic vars 30582 1726855357.88346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855357.88373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855357.88390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855357.88405: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.88416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.88438: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855357.88441: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.88444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.88518: Set connection var ansible_timeout to 10 30582 1726855357.88521: Set connection var ansible_connection to ssh 30582 1726855357.88527: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855357.88531: Set connection var ansible_pipelining to False 30582 1726855357.88536: Set connection var ansible_shell_executable to /bin/sh 30582 1726855357.88539: Set connection var ansible_shell_type to sh 30582 1726855357.88555: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.88558: variable 'ansible_connection' from source: unknown 30582 1726855357.88560: variable 'ansible_module_compression' from source: unknown 30582 1726855357.88563: variable 'ansible_shell_type' from source: unknown 30582 1726855357.88567: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.88570: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.88572: variable 'ansible_pipelining' from source: unknown 30582 1726855357.88574: variable 'ansible_timeout' from source: unknown 30582 1726855357.88576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.88680: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855357.88690: variable 'omit' from source: magic vars 30582 1726855357.88696: starting attempt loop 30582 1726855357.88698: running the handler 30582 1726855357.88738: variable '__network_connections_result' from source: set_fact 30582 1726855357.88798: variable '__network_connections_result' from source: set_fact 30582 1726855357.88869: handler run complete 30582 1726855357.88886: attempt loop complete, returning result 30582 1726855357.88892: _execute() done 30582 1726855357.88895: dumping result to json 30582 1726855357.88897: done dumping result, returning 30582 1726855357.88905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001b4e] 30582 1726855357.88909: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4e 30582 1726855357.88998: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4e 30582 1726855357.89001: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30582 1726855357.89090: no more pending results, returning what we have 30582 1726855357.89094: results queue empty 30582 1726855357.89095: checking for any_errors_fatal 30582 1726855357.89102: done checking for any_errors_fatal 30582 1726855357.89103: checking for max_fail_percentage 30582 1726855357.89104: done checking for max_fail_percentage 30582 1726855357.89105: checking to see if all hosts have 
failed and the running result is not ok 30582 1726855357.89106: done checking to see if all hosts have failed 30582 1726855357.89107: getting the remaining hosts for this loop 30582 1726855357.89108: done getting the remaining hosts for this loop 30582 1726855357.89111: getting the next task for host managed_node3 30582 1726855357.89120: done getting next task for host managed_node3 30582 1726855357.89123: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855357.89128: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855357.89141: getting variables 30582 1726855357.89142: in VariableManager get_vars() 30582 1726855357.89182: Calling all_inventory to load vars for managed_node3 30582 1726855357.89185: Calling groups_inventory to load vars for managed_node3 30582 1726855357.89191: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.89201: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.89203: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.89206: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.90012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.90885: done with get_vars() 30582 1726855357.90905: done getting variables 30582 1726855357.90949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:37 -0400 (0:00:00.037) 0:01:34.259 ****** 30582 1726855357.90980: entering _queue_task() for managed_node3/debug 30582 1726855357.91238: worker is 1 (out of 1 available) 30582 1726855357.91253: exiting _queue_task() for managed_node3/debug 30582 1726855357.91269: done queuing things up, now waiting for results queue to drain 30582 1726855357.91271: waiting for pending results... 
30582 1726855357.91454: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855357.91557: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b4f 30582 1726855357.91570: variable 'ansible_search_path' from source: unknown 30582 1726855357.91573: variable 'ansible_search_path' from source: unknown 30582 1726855357.91605: calling self._execute() 30582 1726855357.91671: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.91675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.91684: variable 'omit' from source: magic vars 30582 1726855357.91967: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.91975: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.92059: variable 'network_state' from source: role '' defaults 30582 1726855357.92070: Evaluated conditional (network_state != {}): False 30582 1726855357.92073: when evaluation is False, skipping this task 30582 1726855357.92076: _execute() done 30582 1726855357.92079: dumping result to json 30582 1726855357.92081: done dumping result, returning 30582 1726855357.92090: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000001b4f] 30582 1726855357.92093: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4f 30582 1726855357.92184: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b4f 30582 1726855357.92189: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855357.92233: no more pending results, returning what we have 30582 1726855357.92237: results queue empty 30582 1726855357.92238: checking for any_errors_fatal 30582 1726855357.92247: done checking for any_errors_fatal 30582 1726855357.92247: checking for 
max_fail_percentage 30582 1726855357.92249: done checking for max_fail_percentage 30582 1726855357.92250: checking to see if all hosts have failed and the running result is not ok 30582 1726855357.92251: done checking to see if all hosts have failed 30582 1726855357.92253: getting the remaining hosts for this loop 30582 1726855357.92254: done getting the remaining hosts for this loop 30582 1726855357.92257: getting the next task for host managed_node3 30582 1726855357.92268: done getting next task for host managed_node3 30582 1726855357.92271: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855357.92276: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855357.92304: getting variables 30582 1726855357.92306: in VariableManager get_vars() 30582 1726855357.92343: Calling all_inventory to load vars for managed_node3 30582 1726855357.92345: Calling groups_inventory to load vars for managed_node3 30582 1726855357.92348: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855357.92357: Calling all_plugins_play to load vars for managed_node3 30582 1726855357.92359: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855357.92361: Calling groups_plugins_play to load vars for managed_node3 30582 1726855357.93297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855357.94155: done with get_vars() 30582 1726855357.94175: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:37 -0400 (0:00:00.032) 0:01:34.292 ****** 30582 1726855357.94244: entering _queue_task() for managed_node3/ping 30582 1726855357.94494: worker is 1 (out of 1 available) 30582 1726855357.94507: exiting _queue_task() for managed_node3/ping 30582 1726855357.94520: done queuing things up, now waiting for results queue to drain 30582 1726855357.94522: waiting for pending results... 
30582 1726855357.94709: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855357.94810: in run() - task 0affcc66-ac2b-aa83-7d57-000000001b50 30582 1726855357.94823: variable 'ansible_search_path' from source: unknown 30582 1726855357.94827: variable 'ansible_search_path' from source: unknown 30582 1726855357.94857: calling self._execute() 30582 1726855357.94927: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.94931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.94943: variable 'omit' from source: magic vars 30582 1726855357.95223: variable 'ansible_distribution_major_version' from source: facts 30582 1726855357.95232: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855357.95238: variable 'omit' from source: magic vars 30582 1726855357.95283: variable 'omit' from source: magic vars 30582 1726855357.95307: variable 'omit' from source: magic vars 30582 1726855357.95337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855357.95366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855357.95384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855357.95399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.95410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855357.95434: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855357.95437: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.95439: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855357.95513: Set connection var ansible_timeout to 10 30582 1726855357.95517: Set connection var ansible_connection to ssh 30582 1726855357.95522: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855357.95527: Set connection var ansible_pipelining to False 30582 1726855357.95533: Set connection var ansible_shell_executable to /bin/sh 30582 1726855357.95535: Set connection var ansible_shell_type to sh 30582 1726855357.95551: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.95554: variable 'ansible_connection' from source: unknown 30582 1726855357.95557: variable 'ansible_module_compression' from source: unknown 30582 1726855357.95559: variable 'ansible_shell_type' from source: unknown 30582 1726855357.95561: variable 'ansible_shell_executable' from source: unknown 30582 1726855357.95566: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855357.95568: variable 'ansible_pipelining' from source: unknown 30582 1726855357.95570: variable 'ansible_timeout' from source: unknown 30582 1726855357.95572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855357.95719: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855357.95726: variable 'omit' from source: magic vars 30582 1726855357.95731: starting attempt loop 30582 1726855357.95734: running the handler 30582 1726855357.95745: _low_level_execute_command(): starting 30582 1726855357.95752: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855357.96248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855357.96283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.96286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855357.96291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855357.96293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.96343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.96346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.96348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.96420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855357.98113: stdout chunk (state=3): >>>/root <<< 30582 1726855357.98210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855357.98241: stderr chunk (state=3): >>><<< 30582 1726855357.98244: stdout chunk (state=3): >>><<< 30582 1726855357.98267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855357.98282: _low_level_execute_command(): starting 30582 1726855357.98290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982 `" && echo ansible-tmp-1726855357.9826982-34995-46760694780982="` echo /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982 `" ) && sleep 0' 30582 1726855357.98735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855357.98739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855357.98741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 
1726855357.98754: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855357.98756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855357.98804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855357.98807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855357.98812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855357.98873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855358.00821: stdout chunk (state=3): >>>ansible-tmp-1726855357.9826982-34995-46760694780982=/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982 <<< 30582 1726855358.00927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855358.00953: stderr chunk (state=3): >>><<< 30582 1726855358.00956: stdout chunk (state=3): >>><<< 30582 1726855358.00974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855357.9826982-34995-46760694780982=/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855358.01015: variable 'ansible_module_compression' from source: unknown 30582 1726855358.01048: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855358.01077: variable 'ansible_facts' from source: unknown 30582 1726855358.01132: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py 30582 1726855358.01229: Sending initial data 30582 1726855358.01233: Sent initial data (152 bytes) 30582 1726855358.01653: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855358.01660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855358.01685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855358.01697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855358.01744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855358.01748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855358.01815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855358.03436: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855358.03507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855358.03578: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjbie41pp /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py <<< 30582 1726855358.03582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py" <<< 30582 1726855358.03651: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpjbie41pp" to remote "/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py" <<< 30582 1726855358.04556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855358.04692: stderr chunk (state=3): >>><<< 30582 1726855358.04695: stdout chunk (state=3): >>><<< 30582 1726855358.04697: done transferring module to remote 30582 1726855358.04699: _low_level_execute_command(): starting 30582 1726855358.04702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/ /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py && sleep 0' 30582 1726855358.05683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855358.05712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855358.05934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855358.06030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855358.07834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855358.07880: stderr chunk (state=3): >>><<< 30582 1726855358.07884: stdout chunk (state=3): >>><<< 30582 1726855358.07904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855358.07908: _low_level_execute_command(): starting 30582 1726855358.07990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/AnsiballZ_ping.py && sleep 0' 30582 1726855358.08635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855358.08639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855358.08641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855358.08644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855358.08646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855358.08648: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855358.08649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855358.08652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855358.08655: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855358.08896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' <<< 30582 1726855358.08899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855358.08901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855358.08903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855358.23936: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855358.25390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855358.25403: stdout chunk (state=3): >>><<< 30582 1726855358.25417: stderr chunk (state=3): >>><<< 30582 1726855358.25442: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855358.25480: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855358.25499: _low_level_execute_command(): starting 30582 1726855358.25508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855357.9826982-34995-46760694780982/ > /dev/null 2>&1 && sleep 0' 30582 1726855358.26152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855358.26171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855358.26185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855358.26258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855358.26312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855358.26336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855358.26434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855358.28363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855358.28389: stdout chunk (state=3): >>><<< 30582 1726855358.28403: stderr chunk (state=3): >>><<< 30582 1726855358.28425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30582 1726855358.28437: handler run complete 30582 1726855358.28458: attempt loop complete, returning result 30582 1726855358.28467: _execute() done 30582 1726855358.28485: dumping result to json 30582 1726855358.28490: done dumping result, returning 30582 1726855358.28593: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000001b50] 30582 1726855358.28596: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b50 30582 1726855358.28668: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001b50 30582 1726855358.28672: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855358.28752: no more pending results, returning what we have 30582 1726855358.28757: results queue empty 30582 1726855358.28758: checking for any_errors_fatal 30582 1726855358.28766: done checking for any_errors_fatal 30582 1726855358.28767: checking for max_fail_percentage 30582 1726855358.28769: done checking for max_fail_percentage 30582 1726855358.28770: checking to see if all hosts have failed and the running result is not ok 30582 1726855358.28771: done checking to see if all hosts have failed 30582 1726855358.28772: getting the remaining hosts for this loop 30582 1726855358.28773: done getting the remaining hosts for this loop 30582 1726855358.28777: getting the next task for host managed_node3 30582 1726855358.28791: done getting next task for host managed_node3 30582 1726855358.28793: ^ task is: TASK: meta (role_complete) 30582 1726855358.28799: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855358.28815: getting variables 30582 1726855358.28817: in VariableManager get_vars() 30582 1726855358.28862: Calling all_inventory to load vars for managed_node3 30582 1726855358.28865: Calling groups_inventory to load vars for managed_node3 30582 1726855358.28868: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.28880: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.28884: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.29090: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.30877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.32569: done with get_vars() 30582 1726855358.32822: done getting variables 30582 1726855358.32913: done queuing things up, now waiting for results queue to drain 30582 1726855358.32916: results queue empty 30582 1726855358.32917: checking for any_errors_fatal 30582 1726855358.32920: done checking for any_errors_fatal 30582 1726855358.32921: checking for max_fail_percentage 30582 1726855358.32922: done checking for max_fail_percentage 30582 1726855358.32923: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855358.32924: done checking to see if all hosts have failed 30582 1726855358.32924: getting the remaining hosts for this loop 30582 1726855358.32925: done getting the remaining hosts for this loop 30582 1726855358.32928: getting the next task for host managed_node3 30582 1726855358.32934: done getting next task for host managed_node3 30582 1726855358.32936: ^ task is: TASK: Test 30582 1726855358.32939: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855358.32941: getting variables 30582 1726855358.32942: in VariableManager get_vars() 30582 1726855358.32956: Calling all_inventory to load vars for managed_node3 30582 1726855358.32958: Calling groups_inventory to load vars for managed_node3 30582 1726855358.32960: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.32965: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.32967: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.32970: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.35931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.38958: done with get_vars() 30582 1726855358.39020: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:02:38 -0400 (0:00:00.448) 0:01:34.740 ****** 30582 1726855358.39107: entering _queue_task() for managed_node3/include_tasks 30582 1726855358.39690: worker is 1 (out of 1 available) 30582 1726855358.39704: exiting _queue_task() for managed_node3/include_tasks 30582 1726855358.39716: done queuing things up, now waiting for results queue to drain 30582 1726855358.39718: waiting for pending results... 
30582 1726855358.40211: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855358.40299: in run() - task 0affcc66-ac2b-aa83-7d57-000000001748 30582 1726855358.40306: variable 'ansible_search_path' from source: unknown 30582 1726855358.40309: variable 'ansible_search_path' from source: unknown 30582 1726855358.40340: variable 'lsr_test' from source: include params 30582 1726855358.40598: variable 'lsr_test' from source: include params 30582 1726855358.40714: variable 'omit' from source: magic vars 30582 1726855358.40881: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855358.40934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855358.40938: variable 'omit' from source: magic vars 30582 1726855358.41247: variable 'ansible_distribution_major_version' from source: facts 30582 1726855358.41262: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855358.41281: variable 'item' from source: unknown 30582 1726855358.41390: variable 'item' from source: unknown 30582 1726855358.41393: variable 'item' from source: unknown 30582 1726855358.41461: variable 'item' from source: unknown 30582 1726855358.42153: dumping result to json 30582 1726855358.42156: done dumping result, returning 30582 1726855358.42159: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-000000001748] 30582 1726855358.42162: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001748 30582 1726855358.42522: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001748 30582 1726855358.42526: WORKER PROCESS EXITING 30582 1726855358.42558: no more pending results, returning what we have 30582 1726855358.42566: in VariableManager get_vars() 30582 1726855358.42653: Calling all_inventory to load vars for managed_node3 30582 1726855358.42658: Calling groups_inventory to load vars for managed_node3 30582 1726855358.42665: Calling all_plugins_inventory to load 
vars for managed_node3 30582 1726855358.42681: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.42685: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.42690: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.47354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.50810: done with get_vars() 30582 1726855358.50840: variable 'ansible_search_path' from source: unknown 30582 1726855358.50842: variable 'ansible_search_path' from source: unknown 30582 1726855358.50893: we have included files to process 30582 1726855358.50894: generating all_blocks data 30582 1726855358.50897: done generating all_blocks data 30582 1726855358.50902: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855358.50903: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855358.50907: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855358.51306: done processing included file 30582 1726855358.51309: iterating over new_blocks loaded from include file 30582 1726855358.51310: in VariableManager get_vars() 30582 1726855358.51329: done with get_vars() 30582 1726855358.51331: filtering new block on tags 30582 1726855358.51356: done filtering new block on tags 30582 1726855358.51358: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml) 30582 1726855358.51366: extending task lists for all hosts with included blocks 30582 1726855358.53386: done extending task 
lists 30582 1726855358.53591: done processing included files 30582 1726855358.53592: results queue empty 30582 1726855358.53593: checking for any_errors_fatal 30582 1726855358.53595: done checking for any_errors_fatal 30582 1726855358.53596: checking for max_fail_percentage 30582 1726855358.53597: done checking for max_fail_percentage 30582 1726855358.53598: checking to see if all hosts have failed and the running result is not ok 30582 1726855358.53599: done checking to see if all hosts have failed 30582 1726855358.53600: getting the remaining hosts for this loop 30582 1726855358.53601: done getting the remaining hosts for this loop 30582 1726855358.53604: getting the next task for host managed_node3 30582 1726855358.53609: done getting next task for host managed_node3 30582 1726855358.53611: ^ task is: TASK: Include network role 30582 1726855358.53614: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855358.53618: getting variables 30582 1726855358.53619: in VariableManager get_vars() 30582 1726855358.53635: Calling all_inventory to load vars for managed_node3 30582 1726855358.53637: Calling groups_inventory to load vars for managed_node3 30582 1726855358.53640: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.53646: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.53648: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.53651: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.56416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.59652: done with get_vars() 30582 1726855358.59693: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 14:02:38 -0400 (0:00:00.206) 0:01:34.949 ****** 30582 1726855358.59997: entering _queue_task() for managed_node3/include_role 30582 1726855358.60574: worker is 1 (out of 1 available) 30582 1726855358.60585: exiting _queue_task() for managed_node3/include_role 30582 1726855358.60702: done queuing things up, now waiting for results queue to drain 30582 1726855358.60704: waiting for pending results... 
30582 1726855358.61707: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855358.62093: in run() - task 0affcc66-ac2b-aa83-7d57-000000001ca9 30582 1726855358.62097: variable 'ansible_search_path' from source: unknown 30582 1726855358.62100: variable 'ansible_search_path' from source: unknown 30582 1726855358.62103: calling self._execute() 30582 1726855358.62105: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855358.62108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855358.62110: variable 'omit' from source: magic vars 30582 1726855358.63439: variable 'ansible_distribution_major_version' from source: facts 30582 1726855358.63459: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855358.63893: _execute() done 30582 1726855358.63897: dumping result to json 30582 1726855358.63899: done dumping result, returning 30582 1726855358.63901: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000001ca9] 30582 1726855358.63904: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001ca9 30582 1726855358.64195: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001ca9 30582 1726855358.64230: no more pending results, returning what we have 30582 1726855358.64237: in VariableManager get_vars() 30582 1726855358.64497: Calling all_inventory to load vars for managed_node3 30582 1726855358.64500: Calling groups_inventory to load vars for managed_node3 30582 1726855358.64504: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.64518: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.64522: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.64525: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.65394: WORKER PROCESS EXITING 30582 1726855358.67473: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.70736: done with get_vars() 30582 1726855358.70769: variable 'ansible_search_path' from source: unknown 30582 1726855358.70771: variable 'ansible_search_path' from source: unknown 30582 1726855358.71122: variable 'omit' from source: magic vars 30582 1726855358.71161: variable 'omit' from source: magic vars 30582 1726855358.71179: variable 'omit' from source: magic vars 30582 1726855358.71183: we have included files to process 30582 1726855358.71184: generating all_blocks data 30582 1726855358.71186: done generating all_blocks data 30582 1726855358.71391: processing included file: fedora.linux_system_roles.network 30582 1726855358.71412: in VariableManager get_vars() 30582 1726855358.71430: done with get_vars() 30582 1726855358.71459: in VariableManager get_vars() 30582 1726855358.71480: done with get_vars() 30582 1726855358.71522: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855358.71838: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855358.72126: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855358.73030: in VariableManager get_vars() 30582 1726855358.73056: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855358.77100: iterating over new_blocks loaded from include file 30582 1726855358.77103: in VariableManager get_vars() 30582 1726855358.77125: done with get_vars() 30582 1726855358.77127: filtering new block on tags 30582 1726855358.77810: done filtering new block on tags 30582 1726855358.77814: in VariableManager get_vars() 30582 1726855358.77832: done with get_vars() 30582 1726855358.77834: filtering new block on tags 30582 1726855358.77850: done 
filtering new block on tags 30582 1726855358.77854: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855358.77860: extending task lists for all hosts with included blocks 30582 1726855358.77975: done extending task lists 30582 1726855358.77976: done processing included files 30582 1726855358.77977: results queue empty 30582 1726855358.77978: checking for any_errors_fatal 30582 1726855358.77983: done checking for any_errors_fatal 30582 1726855358.77984: checking for max_fail_percentage 30582 1726855358.77985: done checking for max_fail_percentage 30582 1726855358.77986: checking to see if all hosts have failed and the running result is not ok 30582 1726855358.77988: done checking to see if all hosts have failed 30582 1726855358.78191: getting the remaining hosts for this loop 30582 1726855358.78192: done getting the remaining hosts for this loop 30582 1726855358.78196: getting the next task for host managed_node3 30582 1726855358.78201: done getting next task for host managed_node3 30582 1726855358.78204: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855358.78207: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855358.78218: getting variables 30582 1726855358.78220: in VariableManager get_vars() 30582 1726855358.78234: Calling all_inventory to load vars for managed_node3 30582 1726855358.78236: Calling groups_inventory to load vars for managed_node3 30582 1726855358.78238: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.78244: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.78246: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.78249: Calling groups_plugins_play to load vars for managed_node3 30582 1726855358.80710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.84274: done with get_vars() 30582 1726855358.84301: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:38 -0400 (0:00:00.243) 0:01:35.193 ****** 30582 1726855358.84384: entering _queue_task() for managed_node3/include_tasks 30582 1726855358.85391: worker is 1 (out of 1 available) 30582 1726855358.85404: exiting _queue_task() for managed_node3/include_tasks 30582 1726855358.85414: done queuing things up, now waiting for results queue to drain 30582 1726855358.85416: waiting for pending results... 
30582 1726855358.86119: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855358.86244: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d2b 30582 1726855358.86257: variable 'ansible_search_path' from source: unknown 30582 1726855358.86262: variable 'ansible_search_path' from source: unknown 30582 1726855358.86414: calling self._execute() 30582 1726855358.86551: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855358.86555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855358.86564: variable 'omit' from source: magic vars 30582 1726855358.87360: variable 'ansible_distribution_major_version' from source: facts 30582 1726855358.87375: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855358.87381: _execute() done 30582 1726855358.87385: dumping result to json 30582 1726855358.87394: done dumping result, returning 30582 1726855358.87693: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000001d2b] 30582 1726855358.87697: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2b 30582 1726855358.87765: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2b 30582 1726855358.87770: WORKER PROCESS EXITING 30582 1726855358.87828: no more pending results, returning what we have 30582 1726855358.87835: in VariableManager get_vars() 30582 1726855358.87896: Calling all_inventory to load vars for managed_node3 30582 1726855358.87900: Calling groups_inventory to load vars for managed_node3 30582 1726855358.87902: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.87916: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.87920: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.87922: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855358.90831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855358.94081: done with get_vars() 30582 1726855358.94313: variable 'ansible_search_path' from source: unknown 30582 1726855358.94315: variable 'ansible_search_path' from source: unknown 30582 1726855358.94354: we have included files to process 30582 1726855358.94355: generating all_blocks data 30582 1726855358.94357: done generating all_blocks data 30582 1726855358.94360: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855358.94361: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855358.94367: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855358.95551: done processing included file 30582 1726855358.95553: iterating over new_blocks loaded from include file 30582 1726855358.95554: in VariableManager get_vars() 30582 1726855358.95589: done with get_vars() 30582 1726855358.95591: filtering new block on tags 30582 1726855358.95623: done filtering new block on tags 30582 1726855358.95626: in VariableManager get_vars() 30582 1726855358.95649: done with get_vars() 30582 1726855358.95650: filtering new block on tags 30582 1726855358.95904: done filtering new block on tags 30582 1726855358.95907: in VariableManager get_vars() 30582 1726855358.95933: done with get_vars() 30582 1726855358.95935: filtering new block on tags 30582 1726855358.95982: done filtering new block on tags 30582 1726855358.95985: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855358.95992: extending task lists for 
all hosts with included blocks 30582 1726855358.99618: done extending task lists 30582 1726855358.99620: done processing included files 30582 1726855358.99621: results queue empty 30582 1726855358.99621: checking for any_errors_fatal 30582 1726855358.99625: done checking for any_errors_fatal 30582 1726855358.99626: checking for max_fail_percentage 30582 1726855358.99627: done checking for max_fail_percentage 30582 1726855358.99628: checking to see if all hosts have failed and the running result is not ok 30582 1726855358.99628: done checking to see if all hosts have failed 30582 1726855358.99629: getting the remaining hosts for this loop 30582 1726855358.99631: done getting the remaining hosts for this loop 30582 1726855358.99633: getting the next task for host managed_node3 30582 1726855358.99639: done getting next task for host managed_node3 30582 1726855358.99642: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855358.99646: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855358.99657: getting variables 30582 1726855358.99659: in VariableManager get_vars() 30582 1726855358.99679: Calling all_inventory to load vars for managed_node3 30582 1726855358.99682: Calling groups_inventory to load vars for managed_node3 30582 1726855358.99684: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855358.99893: Calling all_plugins_play to load vars for managed_node3 30582 1726855358.99897: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855358.99901: Calling groups_plugins_play to load vars for managed_node3 30582 1726855359.02358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855359.05613: done with get_vars() 30582 1726855359.05653: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:39 -0400 (0:00:00.214) 0:01:35.408 ****** 30582 1726855359.05871: entering _queue_task() for managed_node3/setup 30582 1726855359.06675: worker is 1 (out of 1 available) 30582 1726855359.07090: exiting _queue_task() for managed_node3/setup 30582 1726855359.07102: done queuing things up, now waiting for results queue to drain 30582 1726855359.07104: waiting for pending results... 
30582 1726855359.07607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855359.07847: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d82 30582 1726855359.07862: variable 'ansible_search_path' from source: unknown 30582 1726855359.07866: variable 'ansible_search_path' from source: unknown 30582 1726855359.07992: calling self._execute() 30582 1726855359.08233: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.08239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855359.08249: variable 'omit' from source: magic vars 30582 1726855359.09144: variable 'ansible_distribution_major_version' from source: facts 30582 1726855359.09155: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855359.09696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855359.13194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855359.13198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855359.13201: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855359.13221: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855359.13247: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855359.13335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855359.13364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855359.13400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855359.13439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855359.13455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855359.13517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855359.13541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855359.13565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855359.13614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855359.13628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855359.13955: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855359.13966: variable 'ansible_facts' from source: unknown 30582 1726855359.15711: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855359.15715: when evaluation is False, skipping this task 30582 1726855359.15718: _execute() done 30582 1726855359.15723: dumping result to json 30582 1726855359.15725: done dumping result, returning 30582 1726855359.15736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000001d82] 30582 1726855359.15739: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d82 30582 1726855359.16000: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d82 30582 1726855359.16004: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855359.16051: no more pending results, returning what we have 30582 1726855359.16055: results queue empty 30582 1726855359.16056: checking for any_errors_fatal 30582 1726855359.16058: done checking for any_errors_fatal 30582 1726855359.16058: checking for max_fail_percentage 30582 1726855359.16060: done checking for max_fail_percentage 30582 1726855359.16061: checking to see if all hosts have failed and the running result is not ok 30582 1726855359.16062: done checking to see if all hosts have failed 30582 1726855359.16063: getting the remaining hosts for this loop 30582 1726855359.16067: done getting the remaining hosts for this loop 30582 1726855359.16071: getting the next task for host managed_node3 30582 1726855359.16082: done getting next task for host managed_node3 30582 1726855359.16086: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855359.16094: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855359.16118: getting variables 30582 1726855359.16120: in VariableManager get_vars() 30582 1726855359.16167: Calling all_inventory to load vars for managed_node3 30582 1726855359.16170: Calling groups_inventory to load vars for managed_node3 30582 1726855359.16173: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855359.16183: Calling all_plugins_play to load vars for managed_node3 30582 1726855359.16191: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855359.16201: Calling groups_plugins_play to load vars for managed_node3 30582 1726855359.17960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855359.21951: done with get_vars() 30582 1726855359.22111: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:39 -0400 (0:00:00.166) 0:01:35.575 ****** 30582 1726855359.22537: entering _queue_task() for managed_node3/stat 30582 1726855359.23760: worker is 1 (out of 1 available) 30582 1726855359.23777: exiting _queue_task() for managed_node3/stat 30582 1726855359.23793: done queuing things up, now waiting for results queue to drain 30582 1726855359.23795: waiting for pending results... 
30582 1726855359.24301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855359.24870: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d84 30582 1726855359.24874: variable 'ansible_search_path' from source: unknown 30582 1726855359.24877: variable 'ansible_search_path' from source: unknown 30582 1726855359.24977: calling self._execute() 30582 1726855359.25195: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.25199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855359.25202: variable 'omit' from source: magic vars 30582 1726855359.25863: variable 'ansible_distribution_major_version' from source: facts 30582 1726855359.25883: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855359.26173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855359.26904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855359.26958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855359.27078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855359.27120: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855359.27694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855359.27698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855359.27702: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855359.27705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855359.28293: variable '__network_is_ostree' from source: set_fact 30582 1726855359.28296: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855359.28299: when evaluation is False, skipping this task 30582 1726855359.28300: _execute() done 30582 1726855359.28302: dumping result to json 30582 1726855359.28304: done dumping result, returning 30582 1726855359.28306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000001d84] 30582 1726855359.28308: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d84 30582 1726855359.28382: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d84 30582 1726855359.28386: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855359.28444: no more pending results, returning what we have 30582 1726855359.28449: results queue empty 30582 1726855359.28450: checking for any_errors_fatal 30582 1726855359.28456: done checking for any_errors_fatal 30582 1726855359.28457: checking for max_fail_percentage 30582 1726855359.28460: done checking for max_fail_percentage 30582 1726855359.28461: checking to see if all hosts have failed and the running result is not ok 30582 1726855359.28462: done checking to see if all hosts have failed 30582 1726855359.28463: getting the remaining hosts for this loop 30582 1726855359.28467: done getting the remaining hosts for this loop 30582 
1726855359.28473: getting the next task for host managed_node3 30582 1726855359.28481: done getting next task for host managed_node3 30582 1726855359.28486: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855359.28494: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855359.28523: getting variables 30582 1726855359.28525: in VariableManager get_vars() 30582 1726855359.28581: Calling all_inventory to load vars for managed_node3 30582 1726855359.28584: Calling groups_inventory to load vars for managed_node3 30582 1726855359.28587: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855359.28708: Calling all_plugins_play to load vars for managed_node3 30582 1726855359.28712: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855359.28715: Calling groups_plugins_play to load vars for managed_node3 30582 1726855359.31834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855359.35457: done with get_vars() 30582 1726855359.35500: done getting variables 30582 1726855359.35570: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:39 -0400 (0:00:00.132) 0:01:35.708 ****** 30582 1726855359.35827: entering _queue_task() for managed_node3/set_fact 30582 1726855359.36631: worker is 1 (out of 1 available) 30582 1726855359.36644: exiting _queue_task() for managed_node3/set_fact 30582 1726855359.36657: done queuing things up, now waiting for results queue to drain 30582 1726855359.36659: waiting for pending results... 
30582 1726855359.37226: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855359.37694: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d85 30582 1726855359.37697: variable 'ansible_search_path' from source: unknown 30582 1726855359.37700: variable 'ansible_search_path' from source: unknown 30582 1726855359.37993: calling self._execute() 30582 1726855359.38193: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.38197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855359.38200: variable 'omit' from source: magic vars 30582 1726855359.38746: variable 'ansible_distribution_major_version' from source: facts 30582 1726855359.38807: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855359.39209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855359.39711: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855359.39875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855359.39912: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855359.40013: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855359.40177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855359.40392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855359.40396: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855359.40398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855359.40531: variable '__network_is_ostree' from source: set_fact 30582 1726855359.40540: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855359.40660: when evaluation is False, skipping this task 30582 1726855359.40664: _execute() done 30582 1726855359.40669: dumping result to json 30582 1726855359.40674: done dumping result, returning 30582 1726855359.40685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000001d85] 30582 1726855359.40694: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d85 30582 1726855359.40797: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d85 30582 1726855359.40802: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855359.40859: no more pending results, returning what we have 30582 1726855359.40869: results queue empty 30582 1726855359.40870: checking for any_errors_fatal 30582 1726855359.40878: done checking for any_errors_fatal 30582 1726855359.40879: checking for max_fail_percentage 30582 1726855359.40881: done checking for max_fail_percentage 30582 1726855359.40882: checking to see if all hosts have failed and the running result is not ok 30582 1726855359.40883: done checking to see if all hosts have failed 30582 1726855359.40884: getting the remaining hosts for this loop 30582 1726855359.40886: done getting the remaining hosts for this loop 
30582 1726855359.40892: getting the next task for host managed_node3 30582 1726855359.40904: done getting next task for host managed_node3 30582 1726855359.40908: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855359.40914: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855359.40944: getting variables 30582 1726855359.40946: in VariableManager get_vars() 30582 1726855359.41202: Calling all_inventory to load vars for managed_node3 30582 1726855359.41205: Calling groups_inventory to load vars for managed_node3 30582 1726855359.41207: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855359.41220: Calling all_plugins_play to load vars for managed_node3 30582 1726855359.41223: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855359.41226: Calling groups_plugins_play to load vars for managed_node3 30582 1726855359.45525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855359.49070: done with get_vars() 30582 1726855359.49211: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:39 -0400 (0:00:00.137) 0:01:35.846 ****** 30582 1726855359.49616: entering _queue_task() for managed_node3/service_facts 30582 1726855359.50921: worker is 1 (out of 1 available) 30582 1726855359.50929: exiting _queue_task() for managed_node3/service_facts 30582 1726855359.50939: done queuing things up, now waiting for results queue to drain 30582 1726855359.50940: waiting for pending results... 
30582 1726855359.51091: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855359.51500: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d87 30582 1726855359.51617: variable 'ansible_search_path' from source: unknown 30582 1726855359.51621: variable 'ansible_search_path' from source: unknown 30582 1726855359.51624: calling self._execute() 30582 1726855359.51759: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.51763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855359.51777: variable 'omit' from source: magic vars 30582 1726855359.53189: variable 'ansible_distribution_major_version' from source: facts 30582 1726855359.53202: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855359.53208: variable 'omit' from source: magic vars 30582 1726855359.53403: variable 'omit' from source: magic vars 30582 1726855359.53436: variable 'omit' from source: magic vars 30582 1726855359.53481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855359.53823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855359.53844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855359.53862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855359.53877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855359.54020: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855359.54024: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.54027: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855359.54398: Set connection var ansible_timeout to 10 30582 1726855359.54401: Set connection var ansible_connection to ssh 30582 1726855359.54435: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855359.54437: Set connection var ansible_pipelining to False 30582 1726855359.54440: Set connection var ansible_shell_executable to /bin/sh 30582 1726855359.54442: Set connection var ansible_shell_type to sh 30582 1726855359.54558: variable 'ansible_shell_executable' from source: unknown 30582 1726855359.54562: variable 'ansible_connection' from source: unknown 30582 1726855359.54565: variable 'ansible_module_compression' from source: unknown 30582 1726855359.54568: variable 'ansible_shell_type' from source: unknown 30582 1726855359.54591: variable 'ansible_shell_executable' from source: unknown 30582 1726855359.54594: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855359.54596: variable 'ansible_pipelining' from source: unknown 30582 1726855359.54598: variable 'ansible_timeout' from source: unknown 30582 1726855359.54600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855359.55215: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855359.55294: variable 'omit' from source: magic vars 30582 1726855359.55298: starting attempt loop 30582 1726855359.55301: running the handler 30582 1726855359.55303: _low_level_execute_command(): starting 30582 1726855359.55306: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855359.56933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855359.56937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855359.57003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855359.57086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855359.58788: stdout chunk (state=3): >>>/root <<< 30582 1726855359.58889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855359.58938: stderr chunk (state=3): >>><<< 30582 1726855359.58941: stdout chunk (state=3): >>><<< 30582 1726855359.59145: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855359.59150: _low_level_execute_command(): starting 30582 1726855359.59153: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366 `" && echo ansible-tmp-1726855359.5910492-35058-168676809838366="` echo /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366 `" ) && sleep 0' 30582 1726855359.60695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855359.60707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855359.60710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855359.60809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855359.60939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855359.62858: stdout chunk (state=3): >>>ansible-tmp-1726855359.5910492-35058-168676809838366=/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366 <<< 30582 1726855359.62971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855359.63100: stderr chunk (state=3): >>><<< 30582 1726855359.63110: stdout chunk (state=3): >>><<< 30582 1726855359.63132: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855359.5910492-35058-168676809838366=/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855359.63185: variable 'ansible_module_compression' from source: unknown 30582 1726855359.63236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855359.63272: variable 'ansible_facts' from source: unknown 30582 1726855359.63528: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py 30582 1726855359.63531: Sending initial data 30582 1726855359.63533: Sent initial data (162 bytes) 30582 1726855359.64203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855359.64210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855359.64279: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855359.64321: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855359.64324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855359.64327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855359.64554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855359.66121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855359.66183: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855359.66235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpzwk2452m /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py <<< 30582 1726855359.66239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py" <<< 30582 1726855359.66295: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpzwk2452m" to remote "/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py" <<< 30582 1726855359.67710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855359.67714: stdout chunk (state=3): >>><<< 30582 1726855359.67721: stderr chunk (state=3): >>><<< 30582 1726855359.67739: done transferring module to remote 30582 1726855359.67902: _low_level_execute_command(): starting 30582 1726855359.67908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/ /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py && sleep 0' 30582 1726855359.69495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855359.69603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855359.69706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855359.70102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855359.70196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855359.72056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855359.72111: stderr chunk (state=3): >>><<< 30582 1726855359.72119: stdout chunk (state=3): >>><<< 30582 1726855359.72159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855359.72163: _low_level_execute_command(): starting 30582 1726855359.72169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/AnsiballZ_service_facts.py && sleep 0' 30582 1726855359.73531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855359.73641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855359.73696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855359.73755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855359.73860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855359.73964: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.34668: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 30582 1726855361.34711: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 30582 1726855361.34741: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": 
"fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 30582 1726855361.34764: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "st<<< 30582 1726855361.34770: stdout chunk (state=3): >>>ate": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855361.36493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855361.36497: stdout chunk (state=3): >>><<< 30582 1726855361.36499: stderr chunk (state=3): >>><<< 30582 1726855361.36507: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855361.37182: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855361.37194: _low_level_execute_command(): starting 30582 1726855361.37197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855359.5910492-35058-168676809838366/ > /dev/null 2>&1 && sleep 0' 30582 1726855361.37794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855361.37798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855361.37800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.37805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855361.37940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855361.37943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.38007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.40080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855361.40084: stdout chunk (state=3): >>><<< 30582 1726855361.40089: stderr chunk (state=3): >>><<< 30582 1726855361.40107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855361.40293: handler 
run complete 30582 1726855361.40347: variable 'ansible_facts' from source: unknown 30582 1726855361.40551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855361.41092: variable 'ansible_facts' from source: unknown 30582 1726855361.41235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855361.41465: attempt loop complete, returning result 30582 1726855361.41477: _execute() done 30582 1726855361.41483: dumping result to json 30582 1726855361.41563: done dumping result, returning 30582 1726855361.41577: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000001d87] 30582 1726855361.41585: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d87 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855361.43373: no more pending results, returning what we have 30582 1726855361.43376: results queue empty 30582 1726855361.43377: checking for any_errors_fatal 30582 1726855361.43381: done checking for any_errors_fatal 30582 1726855361.43382: checking for max_fail_percentage 30582 1726855361.43384: done checking for max_fail_percentage 30582 1726855361.43385: checking to see if all hosts have failed and the running result is not ok 30582 1726855361.43386: done checking to see if all hosts have failed 30582 1726855361.43389: getting the remaining hosts for this loop 30582 1726855361.43390: done getting the remaining hosts for this loop 30582 1726855361.43394: getting the next task for host managed_node3 30582 1726855361.43409: done getting next task for host managed_node3 30582 1726855361.43413: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855361.43419: ^ state is: HOST STATE: 
block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855361.43432: getting variables 30582 1726855361.43434: in VariableManager get_vars() 30582 1726855361.43470: Calling all_inventory to load vars for managed_node3 30582 1726855361.43473: Calling groups_inventory to load vars for managed_node3 30582 1726855361.43476: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855361.43485: Calling all_plugins_play to load vars for managed_node3 30582 1726855361.43623: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855361.43630: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d87 30582 1726855361.43634: WORKER PROCESS EXITING 30582 1726855361.43638: Calling groups_plugins_play to load vars for managed_node3 30582 1726855361.45521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855361.57265: done with get_vars() 30582 1726855361.57302: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:41 -0400 (0:00:02.077) 0:01:37.923 ****** 30582 1726855361.57400: entering _queue_task() for managed_node3/package_facts 30582 1726855361.57911: worker is 1 (out of 1 available) 30582 1726855361.57925: exiting _queue_task() for managed_node3/package_facts 30582 1726855361.57937: done queuing things up, now waiting for results queue to drain 30582 1726855361.57939: waiting for pending results... 
30582 1726855361.58266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855361.58418: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d88 30582 1726855361.58431: variable 'ansible_search_path' from source: unknown 30582 1726855361.58435: variable 'ansible_search_path' from source: unknown 30582 1726855361.58467: calling self._execute() 30582 1726855361.58604: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855361.58609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855361.58612: variable 'omit' from source: magic vars 30582 1726855361.58978: variable 'ansible_distribution_major_version' from source: facts 30582 1726855361.58992: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855361.58996: variable 'omit' from source: magic vars 30582 1726855361.59133: variable 'omit' from source: magic vars 30582 1726855361.59312: variable 'omit' from source: magic vars 30582 1726855361.59316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855361.59320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855361.59342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855361.59359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855361.59374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855361.59589: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855361.59592: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855361.59595: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855361.59783: Set connection var ansible_timeout to 10 30582 1726855361.59788: Set connection var ansible_connection to ssh 30582 1726855361.59802: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855361.59805: Set connection var ansible_pipelining to False 30582 1726855361.59811: Set connection var ansible_shell_executable to /bin/sh 30582 1726855361.59813: Set connection var ansible_shell_type to sh 30582 1726855361.59837: variable 'ansible_shell_executable' from source: unknown 30582 1726855361.59840: variable 'ansible_connection' from source: unknown 30582 1726855361.59843: variable 'ansible_module_compression' from source: unknown 30582 1726855361.59847: variable 'ansible_shell_type' from source: unknown 30582 1726855361.59849: variable 'ansible_shell_executable' from source: unknown 30582 1726855361.59853: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855361.59855: variable 'ansible_pipelining' from source: unknown 30582 1726855361.59857: variable 'ansible_timeout' from source: unknown 30582 1726855361.59860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855361.60466: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855361.60471: variable 'omit' from source: magic vars 30582 1726855361.60474: starting attempt loop 30582 1726855361.60476: running the handler 30582 1726855361.60483: _low_level_execute_command(): starting 30582 1726855361.60486: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855361.61322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.61333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855361.61336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855361.61338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.61400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.63452: stdout chunk (state=3): >>>/root <<< 30582 1726855361.63456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855361.63459: stdout chunk (state=3): >>><<< 30582 1726855361.63461: stderr chunk (state=3): >>><<< 30582 1726855361.63467: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855361.63469: _low_level_execute_command(): starting 30582 1726855361.63472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358 `" && echo ansible-tmp-1726855361.6334183-35129-51727590433358="` echo /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358 `" ) && sleep 0' 30582 1726855361.64428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855361.64608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855361.64619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.64642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855361.64646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855361.64705: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.65396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.65406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.67410: stdout chunk (state=3): >>>ansible-tmp-1726855361.6334183-35129-51727590433358=/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358 <<< 30582 1726855361.67554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855361.67557: stdout chunk (state=3): >>><<< 30582 1726855361.67567: stderr chunk (state=3): >>><<< 30582 1726855361.67584: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855361.6334183-35129-51727590433358=/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855361.67636: variable 'ansible_module_compression' from source: unknown 30582 1726855361.67685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855361.67765: variable 'ansible_facts' from source: unknown 30582 1726855361.68389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py 30582 1726855361.68546: Sending initial data 30582 1726855361.68554: Sent initial data (161 bytes) 30582 1726855361.70075: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.70080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.70083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.70085: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.70319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.70377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.72026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855361.72111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855361.72171: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp94ucvhfx /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py <<< 30582 1726855361.72179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py" <<< 30582 1726855361.72323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp94ucvhfx" to remote "/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py" <<< 30582 1726855361.74228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855361.74496: stderr chunk (state=3): >>><<< 30582 1726855361.74499: stdout chunk (state=3): >>><<< 30582 1726855361.74502: done transferring module to remote 30582 1726855361.74505: _low_level_execute_command(): starting 30582 1726855361.74507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/ /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py && sleep 0' 30582 1726855361.75019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855361.75035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855361.75046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.75194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855361.75406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.75501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855361.77330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855361.77372: stderr chunk (state=3): >>><<< 30582 1726855361.77375: stdout chunk (state=3): >>><<< 30582 1726855361.77406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855361.77409: _low_level_execute_command(): starting 30582 1726855361.77593: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/AnsiballZ_package_facts.py && sleep 0' 30582 1726855361.78060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855361.78069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855361.78079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855361.78096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855361.78108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855361.78116: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855361.78125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.78139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855361.78161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855361.78174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855361.78198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855361.78296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855361.78320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855361.78332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855361.78342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855361.78455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855362.23222: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": 
"google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": 
[{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30582 1726855362.23359: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", 
"release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", 
"release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": 
[{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30582 1726855362.23436: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": 
"perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", 
"release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855362.23450: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855362.25385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855362.25392: stdout chunk (state=3): >>><<< 30582 1726855362.25396: stderr chunk (state=3): >>><<< 30582 1726855362.25708: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855362.27816: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855362.27835: _low_level_execute_command(): starting 30582 1726855362.27839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855361.6334183-35129-51727590433358/ > /dev/null 2>&1 && sleep 0' 30582 1726855362.28466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855362.28473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855362.28509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855362.28595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855362.28627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855362.28718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855362.30624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855362.30642: stdout chunk (state=3): >>><<< 30582 1726855362.30653: stderr chunk (state=3): >>><<< 30582 1726855362.30673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855362.30682: handler run complete 30582 1726855362.31893: variable 
'ansible_facts' from source: unknown 30582 1726855362.32642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.36197: variable 'ansible_facts' from source: unknown 30582 1726855362.37131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.38589: attempt loop complete, returning result 30582 1726855362.38726: _execute() done 30582 1726855362.38742: dumping result to json 30582 1726855362.39293: done dumping result, returning 30582 1726855362.39297: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000001d88] 30582 1726855362.39299: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d88 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855362.43000: no more pending results, returning what we have 30582 1726855362.43004: results queue empty 30582 1726855362.43005: checking for any_errors_fatal 30582 1726855362.43012: done checking for any_errors_fatal 30582 1726855362.43013: checking for max_fail_percentage 30582 1726855362.43015: done checking for max_fail_percentage 30582 1726855362.43016: checking to see if all hosts have failed and the running result is not ok 30582 1726855362.43017: done checking to see if all hosts have failed 30582 1726855362.43017: getting the remaining hosts for this loop 30582 1726855362.43019: done getting the remaining hosts for this loop 30582 1726855362.43022: getting the next task for host managed_node3 30582 1726855362.43031: done getting next task for host managed_node3 30582 1726855362.43034: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855362.43039: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855362.43108: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d88 30582 1726855362.43112: WORKER PROCESS EXITING 30582 1726855362.43121: getting variables 30582 1726855362.43123: in VariableManager get_vars() 30582 1726855362.43158: Calling all_inventory to load vars for managed_node3 30582 1726855362.43161: Calling groups_inventory to load vars for managed_node3 30582 1726855362.43166: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855362.43177: Calling all_plugins_play to load vars for managed_node3 30582 1726855362.43180: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855362.43184: Calling groups_plugins_play to load vars for managed_node3 30582 1726855362.46447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.49951: done with get_vars() 30582 1726855362.50110: done getting variables 30582 1726855362.50174: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:42 -0400 (0:00:00.929) 0:01:38.853 ****** 30582 1726855362.50332: entering _queue_task() for managed_node3/debug 30582 1726855362.51119: worker is 1 (out of 1 available) 30582 1726855362.51135: exiting _queue_task() for managed_node3/debug 30582 1726855362.51149: done queuing things up, now waiting for results queue to drain 30582 1726855362.51151: waiting for pending results... 
30582 1726855362.51567: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855362.51906: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d2c 30582 1726855362.51922: variable 'ansible_search_path' from source: unknown 30582 1726855362.51925: variable 'ansible_search_path' from source: unknown 30582 1726855362.51961: calling self._execute() 30582 1726855362.52055: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.52059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.52160: variable 'omit' from source: magic vars 30582 1726855362.52858: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.52871: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855362.52877: variable 'omit' from source: magic vars 30582 1726855362.53078: variable 'omit' from source: magic vars 30582 1726855362.53177: variable 'network_provider' from source: set_fact 30582 1726855362.53198: variable 'omit' from source: magic vars 30582 1726855362.53365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855362.53369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855362.53372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855362.53375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855362.53378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855362.53380: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855362.53382: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855362.53385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.53470: Set connection var ansible_timeout to 10 30582 1726855362.53473: Set connection var ansible_connection to ssh 30582 1726855362.53479: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855362.53483: Set connection var ansible_pipelining to False 30582 1726855362.53490: Set connection var ansible_shell_executable to /bin/sh 30582 1726855362.53493: Set connection var ansible_shell_type to sh 30582 1726855362.53515: variable 'ansible_shell_executable' from source: unknown 30582 1726855362.53523: variable 'ansible_connection' from source: unknown 30582 1726855362.53526: variable 'ansible_module_compression' from source: unknown 30582 1726855362.53528: variable 'ansible_shell_type' from source: unknown 30582 1726855362.53531: variable 'ansible_shell_executable' from source: unknown 30582 1726855362.53533: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.53535: variable 'ansible_pipelining' from source: unknown 30582 1726855362.53538: variable 'ansible_timeout' from source: unknown 30582 1726855362.53543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.53897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855362.53900: variable 'omit' from source: magic vars 30582 1726855362.53902: starting attempt loop 30582 1726855362.53904: running the handler 30582 1726855362.53906: handler run complete 30582 1726855362.53907: attempt loop complete, returning result 30582 1726855362.53909: _execute() done 30582 1726855362.53910: dumping result to json 30582 1726855362.53912: done dumping result, returning 
30582 1726855362.53914: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000001d2c] 30582 1726855362.53916: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2c 30582 1726855362.53979: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2c 30582 1726855362.53982: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855362.54062: no more pending results, returning what we have 30582 1726855362.54068: results queue empty 30582 1726855362.54069: checking for any_errors_fatal 30582 1726855362.54076: done checking for any_errors_fatal 30582 1726855362.54076: checking for max_fail_percentage 30582 1726855362.54078: done checking for max_fail_percentage 30582 1726855362.54079: checking to see if all hosts have failed and the running result is not ok 30582 1726855362.54080: done checking to see if all hosts have failed 30582 1726855362.54080: getting the remaining hosts for this loop 30582 1726855362.54082: done getting the remaining hosts for this loop 30582 1726855362.54085: getting the next task for host managed_node3 30582 1726855362.54094: done getting next task for host managed_node3 30582 1726855362.54097: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855362.54102: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855362.54227: getting variables 30582 1726855362.54229: in VariableManager get_vars() 30582 1726855362.54273: Calling all_inventory to load vars for managed_node3 30582 1726855362.54276: Calling groups_inventory to load vars for managed_node3 30582 1726855362.54278: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855362.54326: Calling all_plugins_play to load vars for managed_node3 30582 1726855362.54330: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855362.54334: Calling groups_plugins_play to load vars for managed_node3 30582 1726855362.57140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.58918: done with get_vars() 30582 1726855362.58945: done getting variables 30582 1726855362.59016: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:42 -0400 (0:00:00.087) 0:01:38.940 ****** 30582 1726855362.59072: entering _queue_task() for managed_node3/fail 30582 1726855362.59732: worker is 1 (out of 1 available) 30582 1726855362.59744: exiting _queue_task() for managed_node3/fail 30582 1726855362.59754: done queuing things up, now waiting for results queue to drain 30582 1726855362.59756: waiting for pending results... 30582 1726855362.60180: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855362.60504: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d2d 30582 1726855362.60528: variable 'ansible_search_path' from source: unknown 30582 1726855362.60797: variable 'ansible_search_path' from source: unknown 30582 1726855362.60800: calling self._execute() 30582 1726855362.60851: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.60865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.60881: variable 'omit' from source: magic vars 30582 1726855362.61707: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.61893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855362.62207: variable 'network_state' from source: role '' defaults 30582 1726855362.62326: Evaluated conditional (network_state != {}): False 30582 1726855362.62330: when evaluation is False, skipping this task 30582 1726855362.62332: _execute() done 30582 1726855362.62334: dumping result to json 30582 1726855362.62337: done dumping result, returning 30582 1726855362.62339: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000001d2d] 30582 1726855362.62342: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2d 30582 1726855362.62421: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2d 30582 1726855362.62430: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855362.62485: no more pending results, returning what we have 30582 1726855362.62491: results queue empty 30582 1726855362.62493: checking for any_errors_fatal 30582 1726855362.62500: done checking for any_errors_fatal 30582 1726855362.62501: checking for max_fail_percentage 30582 1726855362.62504: done checking for max_fail_percentage 30582 1726855362.62505: checking to see if all hosts have failed and the running result is not ok 30582 1726855362.62506: done checking to see if all hosts have failed 30582 1726855362.62507: getting the remaining hosts for this loop 30582 1726855362.62508: done getting the remaining hosts for this loop 30582 1726855362.62512: getting the next task for host managed_node3 30582 1726855362.62521: done getting next task for host managed_node3 30582 1726855362.62525: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855362.62531: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855362.62919: getting variables 30582 1726855362.62921: in VariableManager get_vars() 30582 1726855362.62972: Calling all_inventory to load vars for managed_node3 30582 1726855362.62974: Calling groups_inventory to load vars for managed_node3 30582 1726855362.62977: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855362.62994: Calling all_plugins_play to load vars for managed_node3 30582 1726855362.62997: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855362.63000: Calling groups_plugins_play to load vars for managed_node3 30582 1726855362.66524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.70023: done with get_vars() 30582 1726855362.70179: done getting variables 30582 1726855362.70242: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:42 -0400 (0:00:00.113) 0:01:39.053 ****** 30582 1726855362.70403: entering _queue_task() for managed_node3/fail 30582 1726855362.71185: worker is 1 (out of 1 available) 30582 1726855362.71203: exiting _queue_task() for managed_node3/fail 30582 1726855362.71215: done queuing things up, now waiting for results queue to drain 30582 1726855362.71217: waiting for pending results... 30582 1726855362.71661: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855362.72015: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d2e 30582 1726855362.72029: variable 'ansible_search_path' from source: unknown 30582 1726855362.72032: variable 'ansible_search_path' from source: unknown 30582 1726855362.72073: calling self._execute() 30582 1726855362.72430: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.72433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.72448: variable 'omit' from source: magic vars 30582 1726855362.73716: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.73728: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855362.74059: variable 'network_state' from source: role '' defaults 30582 1726855362.74069: Evaluated conditional (network_state != {}): False 30582 1726855362.74073: when evaluation is False, skipping this task 30582 1726855362.74083: _execute() done 30582 1726855362.74086: dumping result to json 30582 1726855362.74090: done dumping result, returning 30582 1726855362.74093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000001d2e] 30582 1726855362.74096: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2e 30582 1726855362.74272: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2e 30582 1726855362.74275: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855362.74334: no more pending results, returning what we have 30582 1726855362.74338: results queue empty 30582 1726855362.74339: checking for any_errors_fatal 30582 1726855362.74350: done checking for any_errors_fatal 30582 1726855362.74350: checking for max_fail_percentage 30582 1726855362.74353: done checking for max_fail_percentage 30582 1726855362.74354: checking to see if all hosts have failed and the running result is not ok 30582 1726855362.74354: done checking to see if all hosts have failed 30582 1726855362.74355: getting the remaining hosts for this loop 30582 1726855362.74356: done getting the remaining hosts for this loop 30582 1726855362.74360: getting the next task for host managed_node3 30582 1726855362.74370: done getting next task for host managed_node3 30582 1726855362.74375: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855362.74381: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855362.74411: getting variables 30582 1726855362.74413: in VariableManager get_vars() 30582 1726855362.74456: Calling all_inventory to load vars for managed_node3 30582 1726855362.74459: Calling groups_inventory to load vars for managed_node3 30582 1726855362.74461: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855362.74474: Calling all_plugins_play to load vars for managed_node3 30582 1726855362.74476: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855362.74479: Calling groups_plugins_play to load vars for managed_node3 30582 1726855362.76755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.78686: done with get_vars() 30582 1726855362.78872: done getting variables 30582 1726855362.78935: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:42 -0400 (0:00:00.087) 0:01:39.141 ****** 30582 1726855362.79124: entering _queue_task() for managed_node3/fail 30582 1726855362.79783: worker is 1 (out of 1 available) 30582 1726855362.79997: exiting _queue_task() for managed_node3/fail 30582 1726855362.80007: done queuing things up, now waiting for results queue to drain 30582 1726855362.80009: waiting for pending results... 30582 1726855362.80408: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855362.80413: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d2f 30582 1726855362.80417: variable 'ansible_search_path' from source: unknown 30582 1726855362.80420: variable 'ansible_search_path' from source: unknown 30582 1726855362.80423: calling self._execute() 30582 1726855362.80426: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.80428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.80431: variable 'omit' from source: magic vars 30582 1726855362.80836: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.80846: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855362.81157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855362.84793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855362.84870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855362.84910: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855362.84943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855362.84979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855362.85180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855362.85184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855362.85188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.85191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855362.85193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855362.85291: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.85309: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855362.85432: variable 'ansible_distribution' from source: facts 30582 1726855362.85436: variable '__network_rh_distros' from source: role '' defaults 30582 1726855362.85447: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855362.85723: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855362.85749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855362.85772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.85812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855362.85832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855362.86042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855362.86046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855362.86049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.86051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855362.86053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855362.86055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855362.86058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855362.86060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.86181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855362.86184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855362.86734: variable 'network_connections' from source: include params 30582 1726855362.86745: variable 'interface' from source: play vars 30582 1726855362.86927: variable 'interface' from source: play vars 30582 1726855362.86938: variable 'network_state' from source: role '' defaults 30582 1726855362.87004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855362.87443: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855362.87847: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855362.87852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855362.87855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855362.87857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855362.87870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855362.87898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.87931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855362.88066: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855362.88070: when evaluation is False, skipping this task 30582 1726855362.88073: _execute() done 30582 1726855362.88075: dumping result to json 30582 1726855362.88078: done dumping result, returning 30582 1726855362.88080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000001d2f] 30582 1726855362.88083: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000001d2f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855362.88247: no more pending results, returning what we have 30582 1726855362.88252: results queue empty 30582 1726855362.88253: checking for any_errors_fatal 30582 1726855362.88259: done checking for any_errors_fatal 30582 1726855362.88260: checking for max_fail_percentage 30582 1726855362.88262: done checking for max_fail_percentage 30582 1726855362.88263: checking to see if all hosts have failed and the running result is not ok 30582 1726855362.88264: done checking to see if all hosts have failed 30582 1726855362.88265: getting the remaining hosts for this loop 30582 1726855362.88266: done getting the remaining hosts for this loop 30582 1726855362.88271: getting the next task for host managed_node3 30582 1726855362.88281: done getting next task for host managed_node3 30582 1726855362.88288: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855362.88294: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855362.88311: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d2f 30582 1726855362.88316: WORKER PROCESS EXITING 30582 1726855362.88337: getting variables 30582 1726855362.88339: in VariableManager get_vars() 30582 1726855362.88391: Calling all_inventory to load vars for managed_node3 30582 1726855362.88394: Calling groups_inventory to load vars for managed_node3 30582 1726855362.88397: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855362.88409: Calling all_plugins_play to load vars for managed_node3 30582 1726855362.88413: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855362.88417: Calling groups_plugins_play to load vars for managed_node3 30582 1726855362.90656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855362.92200: done with get_vars() 30582 1726855362.92227: done getting variables 30582 1726855362.92331: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:42 -0400 (0:00:00.132) 0:01:39.273 ****** 30582 1726855362.92366: entering _queue_task() for managed_node3/dnf 30582 1726855362.93148: worker is 1 (out of 1 available) 30582 1726855362.93160: exiting _queue_task() for managed_node3/dnf 30582 1726855362.93169: done queuing things up, now waiting for results queue to drain 30582 1726855362.93171: waiting for pending results... 30582 1726855362.93557: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855362.93947: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d30 30582 1726855362.93951: variable 'ansible_search_path' from source: unknown 30582 1726855362.93958: variable 'ansible_search_path' from source: unknown 30582 1726855362.94290: calling self._execute() 30582 1726855362.94294: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855362.94299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855362.94319: variable 'omit' from source: magic vars 30582 1726855362.95373: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.95376: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855362.95661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855362.98668: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855362.98753: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855362.98804: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855362.98851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855362.98894: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855362.98990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855362.99302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855362.99305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855362.99307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855362.99309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855362.99626: variable 'ansible_distribution' from source: facts 30582 1726855362.99636: variable 'ansible_distribution_major_version' from source: facts 30582 1726855362.99659: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855363.00060: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855363.00214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.00243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.00306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.00430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.00448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.00535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.00692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.00695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.00698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.00720: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.00761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.00794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.00824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.00860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.00879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.01159: variable 'network_connections' from source: include params 30582 1726855363.01179: variable 'interface' from source: play vars 30582 1726855363.01319: variable 'interface' from source: play vars 30582 1726855363.01469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855363.01886: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855363.02131: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855363.02134: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855363.02136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855363.02427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855363.02593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855363.02605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.02608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855363.02611: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855363.03022: variable 'network_connections' from source: include params 30582 1726855363.03095: variable 'interface' from source: play vars 30582 1726855363.03161: variable 'interface' from source: play vars 30582 1726855363.03322: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855363.03348: when evaluation is False, skipping this task 30582 1726855363.03357: _execute() done 30582 1726855363.03367: dumping result to json 30582 1726855363.03375: done dumping result, returning 30582 1726855363.03391: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001d30] 30582 
1726855363.03511: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d30 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855363.03689: no more pending results, returning what we have 30582 1726855363.03693: results queue empty 30582 1726855363.03694: checking for any_errors_fatal 30582 1726855363.03702: done checking for any_errors_fatal 30582 1726855363.03702: checking for max_fail_percentage 30582 1726855363.03704: done checking for max_fail_percentage 30582 1726855363.03705: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.03706: done checking to see if all hosts have failed 30582 1726855363.03707: getting the remaining hosts for this loop 30582 1726855363.03708: done getting the remaining hosts for this loop 30582 1726855363.03712: getting the next task for host managed_node3 30582 1726855363.03721: done getting next task for host managed_node3 30582 1726855363.03725: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855363.03731: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855363.03769: getting variables 30582 1726855363.03771: in VariableManager get_vars() 30582 1726855363.03821: Calling all_inventory to load vars for managed_node3 30582 1726855363.03823: Calling groups_inventory to load vars for managed_node3 30582 1726855363.03825: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.03839: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.03842: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.03847: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.04680: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d30 30582 1726855363.04684: WORKER PROCESS EXITING 30582 1726855363.07271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.11884: done with get_vars() 30582 1726855363.11918: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855363.12348: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:43 -0400 (0:00:00.200) 0:01:39.473 ****** 30582 1726855363.12385: entering _queue_task() for managed_node3/yum 30582 1726855363.13085: worker is 1 (out of 1 available) 30582 1726855363.13100: exiting _queue_task() for managed_node3/yum 30582 1726855363.13113: done queuing things up, now waiting for results queue to drain 30582 1726855363.13115: waiting for pending results... 30582 1726855363.14099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855363.14606: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d31 30582 1726855363.14609: variable 'ansible_search_path' from source: unknown 30582 1726855363.14613: variable 'ansible_search_path' from source: unknown 30582 1726855363.14794: calling self._execute() 30582 1726855363.15195: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.15200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.15212: variable 'omit' from source: magic vars 30582 1726855363.16348: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.16360: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.16944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855363.22232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855363.22596: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855363.22600: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855363.22612: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855363.22648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855363.22845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.22955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.23019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.23265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.23268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.23337: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.23414: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855363.23491: when evaluation is False, skipping this task 30582 1726855363.23501: _execute() done 30582 1726855363.23511: dumping result to json 30582 1726855363.23520: done dumping result, 
returning 30582 1726855363.23535: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001d31] 30582 1726855363.23547: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d31 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855363.24009: no more pending results, returning what we have 30582 1726855363.24014: results queue empty 30582 1726855363.24016: checking for any_errors_fatal 30582 1726855363.24024: done checking for any_errors_fatal 30582 1726855363.24025: checking for max_fail_percentage 30582 1726855363.24028: done checking for max_fail_percentage 30582 1726855363.24029: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.24029: done checking to see if all hosts have failed 30582 1726855363.24030: getting the remaining hosts for this loop 30582 1726855363.24032: done getting the remaining hosts for this loop 30582 1726855363.24037: getting the next task for host managed_node3 30582 1726855363.24047: done getting next task for host managed_node3 30582 1726855363.24051: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855363.24057: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855363.24093: getting variables 30582 1726855363.24096: in VariableManager get_vars() 30582 1726855363.24145: Calling all_inventory to load vars for managed_node3 30582 1726855363.24149: Calling groups_inventory to load vars for managed_node3 30582 1726855363.24151: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.24164: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.24167: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.24171: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.25024: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d31 30582 1726855363.25028: WORKER PROCESS EXITING 30582 1726855363.29286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.34123: done with get_vars() 30582 1726855363.34274: done getting variables 30582 1726855363.34341: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:43 -0400 (0:00:00.221) 0:01:39.694 ****** 30582 1726855363.34500: entering _queue_task() for managed_node3/fail 30582 1726855363.35251: worker is 1 (out of 1 available) 30582 1726855363.35267: exiting _queue_task() for managed_node3/fail 30582 1726855363.35279: done queuing things up, now waiting for results queue to drain 30582 1726855363.35281: waiting for pending results... 30582 1726855363.35810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855363.36103: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d32 30582 1726855363.36111: variable 'ansible_search_path' from source: unknown 30582 1726855363.36115: variable 'ansible_search_path' from source: unknown 30582 1726855363.36274: calling self._execute() 30582 1726855363.36897: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.36902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.36905: variable 'omit' from source: magic vars 30582 1726855363.37750: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.37886: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.38241: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855363.38983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855363.42334: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855363.42338: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855363.42340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855363.42342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855363.42344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855363.42551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.43064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.43093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.43133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.43152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.43205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 
1726855363.43226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.43321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.43325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.43328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.43348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.43376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.43404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.43440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.43455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30582 1726855363.43757: variable 'network_connections' from source: include params 30582 1726855363.43761: variable 'interface' from source: play vars 30582 1726855363.43766: variable 'interface' from source: play vars 30582 1726855363.43812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855363.44014: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855363.44017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855363.44051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855363.44079: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855363.44224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855363.44227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855363.44230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.44232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855363.44249: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855363.44653: variable 'network_connections' from source: include params 30582 1726855363.44656: variable 'interface' from source: play 
vars 30582 1726855363.44658: variable 'interface' from source: play vars 30582 1726855363.44661: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855363.44666: when evaluation is False, skipping this task 30582 1726855363.44668: _execute() done 30582 1726855363.44670: dumping result to json 30582 1726855363.44672: done dumping result, returning 30582 1726855363.44675: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001d32] 30582 1726855363.44677: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d32 30582 1726855363.44754: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d32 30582 1726855363.44758: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855363.44818: no more pending results, returning what we have 30582 1726855363.44823: results queue empty 30582 1726855363.44824: checking for any_errors_fatal 30582 1726855363.44833: done checking for any_errors_fatal 30582 1726855363.44834: checking for max_fail_percentage 30582 1726855363.44836: done checking for max_fail_percentage 30582 1726855363.44838: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.44838: done checking to see if all hosts have failed 30582 1726855363.44839: getting the remaining hosts for this loop 30582 1726855363.44841: done getting the remaining hosts for this loop 30582 1726855363.44845: getting the next task for host managed_node3 30582 1726855363.44855: done getting next task for host managed_node3 30582 1726855363.44860: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855363.44867: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855363.44898: getting variables 30582 1726855363.44900: in VariableManager get_vars() 30582 1726855363.44950: Calling all_inventory to load vars for managed_node3 30582 1726855363.44953: Calling groups_inventory to load vars for managed_node3 30582 1726855363.44956: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.44968: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.44971: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.44974: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.46876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.49246: done with get_vars() 30582 1726855363.49283: done getting variables 30582 1726855363.49353: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:43 -0400 (0:00:00.148) 0:01:39.843 ****** 30582 1726855363.49397: entering _queue_task() for managed_node3/package 30582 1726855363.49901: worker is 1 (out of 1 available) 30582 1726855363.49912: exiting _queue_task() for managed_node3/package 30582 1726855363.49923: done queuing things up, now waiting for results queue to drain 30582 1726855363.49924: waiting for pending results... 
30582 1726855363.50149: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855363.50331: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d33 30582 1726855363.50335: variable 'ansible_search_path' from source: unknown 30582 1726855363.50338: variable 'ansible_search_path' from source: unknown 30582 1726855363.50439: calling self._execute() 30582 1726855363.50469: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.50472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.50488: variable 'omit' from source: magic vars 30582 1726855363.50936: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.50947: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.51396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855363.51518: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855363.51568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855363.51613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855363.51675: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855363.51798: variable 'network_packages' from source: role '' defaults 30582 1726855363.51969: variable '__network_provider_setup' from source: role '' defaults 30582 1726855363.51973: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855363.51998: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855363.52006: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855363.52076: variable 
'__network_packages_default_nm' from source: role '' defaults 30582 1726855363.52300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855363.55368: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855363.55429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855363.55470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855363.55505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855363.55535: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855363.55621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.55655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.55685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.55975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.55978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855363.55981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.55983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.55985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.55989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.55992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.56294: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855363.56299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.56315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.56346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.56384: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.56404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.56510: variable 'ansible_python' from source: facts 30582 1726855363.56521: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855363.57102: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855363.57105: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855363.57108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.57111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.57113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.57116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.57118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.57211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.57223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.57490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.57524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.57538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.57924: variable 'network_connections' from source: include params 30582 1726855363.57930: variable 'interface' from source: play vars 30582 1726855363.58151: variable 'interface' from source: play vars 30582 1726855363.58402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855363.58520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855363.58523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.58551: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855363.58636: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855363.58968: variable 'network_connections' from source: include params 30582 1726855363.58980: variable 'interface' from source: play vars 30582 1726855363.59072: variable 'interface' from source: play vars 30582 1726855363.59111: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855363.59194: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855363.59485: variable 'network_connections' from source: include params 30582 1726855363.59490: variable 'interface' from source: play vars 30582 1726855363.59554: variable 'interface' from source: play vars 30582 1726855363.59575: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855363.59826: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855363.59952: variable 'network_connections' from source: include params 30582 1726855363.59967: variable 'interface' from source: play vars 30582 1726855363.60023: variable 'interface' from source: play vars 30582 1726855363.60075: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855363.60131: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855363.60137: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855363.60199: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855363.60509: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855363.61427: variable 'network_connections' from source: include params 30582 1726855363.61430: variable 'interface' from 
source: play vars 30582 1726855363.61499: variable 'interface' from source: play vars 30582 1726855363.61507: variable 'ansible_distribution' from source: facts 30582 1726855363.61510: variable '__network_rh_distros' from source: role '' defaults 30582 1726855363.61518: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.61534: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855363.62002: variable 'ansible_distribution' from source: facts 30582 1726855363.62005: variable '__network_rh_distros' from source: role '' defaults 30582 1726855363.62010: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.62144: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855363.62426: variable 'ansible_distribution' from source: facts 30582 1726855363.62431: variable '__network_rh_distros' from source: role '' defaults 30582 1726855363.62434: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.62590: variable 'network_provider' from source: set_fact 30582 1726855363.62661: variable 'ansible_facts' from source: unknown 30582 1726855363.63622: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855363.63632: when evaluation is False, skipping this task 30582 1726855363.63643: _execute() done 30582 1726855363.63650: dumping result to json 30582 1726855363.63657: done dumping result, returning 30582 1726855363.63896: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000001d33] 30582 1726855363.63900: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d33 30582 1726855363.63979: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d33 30582 1726855363.63983: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855363.64033: no more pending results, returning what we have 30582 1726855363.64036: results queue empty 30582 1726855363.64037: checking for any_errors_fatal 30582 1726855363.64044: done checking for any_errors_fatal 30582 1726855363.64044: checking for max_fail_percentage 30582 1726855363.64046: done checking for max_fail_percentage 30582 1726855363.64047: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.64048: done checking to see if all hosts have failed 30582 1726855363.64048: getting the remaining hosts for this loop 30582 1726855363.64050: done getting the remaining hosts for this loop 30582 1726855363.64054: getting the next task for host managed_node3 30582 1726855363.64061: done getting next task for host managed_node3 30582 1726855363.64065: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855363.64070: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855363.64097: getting variables 30582 1726855363.64099: in VariableManager get_vars() 30582 1726855363.64141: Calling all_inventory to load vars for managed_node3 30582 1726855363.64144: Calling groups_inventory to load vars for managed_node3 30582 1726855363.64145: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.64154: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.64156: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.64159: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.66424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.68472: done with get_vars() 30582 1726855363.68506: done getting variables 30582 1726855363.68572: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:43 -0400 (0:00:00.192) 0:01:40.035 ****** 30582 1726855363.68607: entering _queue_task() for managed_node3/package 30582 1726855363.69005: worker is 1 (out of 1 available) 30582 1726855363.69019: exiting _queue_task() for managed_node3/package 30582 1726855363.69031: done queuing things up, now waiting for results queue to drain 30582 
1726855363.69033: waiting for pending results... 30582 1726855363.69707: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855363.69712: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d34 30582 1726855363.69716: variable 'ansible_search_path' from source: unknown 30582 1726855363.69718: variable 'ansible_search_path' from source: unknown 30582 1726855363.69721: calling self._execute() 30582 1726855363.69724: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.69726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.69728: variable 'omit' from source: magic vars 30582 1726855363.70063: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.70171: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.70195: variable 'network_state' from source: role '' defaults 30582 1726855363.70205: Evaluated conditional (network_state != {}): False 30582 1726855363.70208: when evaluation is False, skipping this task 30582 1726855363.70211: _execute() done 30582 1726855363.70213: dumping result to json 30582 1726855363.70216: done dumping result, returning 30582 1726855363.70224: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001d34] 30582 1726855363.70234: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d34 30582 1726855363.70336: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d34 30582 1726855363.70339: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855363.70383: no more pending results, returning what we have 30582 1726855363.70388: 
results queue empty 30582 1726855363.70390: checking for any_errors_fatal 30582 1726855363.70396: done checking for any_errors_fatal 30582 1726855363.70397: checking for max_fail_percentage 30582 1726855363.70399: done checking for max_fail_percentage 30582 1726855363.70400: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.70401: done checking to see if all hosts have failed 30582 1726855363.70402: getting the remaining hosts for this loop 30582 1726855363.70403: done getting the remaining hosts for this loop 30582 1726855363.70407: getting the next task for host managed_node3 30582 1726855363.70415: done getting next task for host managed_node3 30582 1726855363.70418: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855363.70425: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855363.70453: getting variables 30582 1726855363.70455: in VariableManager get_vars() 30582 1726855363.70498: Calling all_inventory to load vars for managed_node3 30582 1726855363.70501: Calling groups_inventory to load vars for managed_node3 30582 1726855363.70503: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.70513: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.70516: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.70518: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.72510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.74014: done with get_vars() 30582 1726855363.74041: done getting variables 30582 1726855363.74097: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:43 -0400 (0:00:00.055) 0:01:40.091 ****** 30582 1726855363.74140: entering _queue_task() for managed_node3/package 30582 1726855363.74732: worker is 1 (out of 1 available) 30582 1726855363.74745: exiting _queue_task() for managed_node3/package 30582 1726855363.74757: done queuing things up, now waiting for results queue to drain 30582 1726855363.74758: waiting for pending results... 
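The two "Install … when using network_state variable" tasks above are skipped because the role default for `network_state` is an empty dict, so the task's `when:` clause `network_state != {}` evaluates to False. A minimal sketch of how such a conditional is evaluated — the expression is compiled and run through Jinja2 against the task's variables (this is an illustration of the mechanism, not ansible-core's actual code):

```python
# Illustrative only: evaluate an Ansible-style `when:` expression with Jinja2,
# the templating engine ansible-core uses (jinja version = 3.1.4 in this run).
from jinja2 import Environment

env = Environment()
when_expr = env.compile_expression("network_state != {}")

# With the role default (an empty dict) the conditional is False and the
# task is skipped, matching the "skip_reason" in the log above.
print(when_expr(network_state={}))                   # False -> task skipped
print(when_expr(network_state={"interfaces": []}))   # True  -> task would run
```

This is why the result JSON carries `"false_condition": "network_state != {}"` rather than an error: a False `when:` is a normal skip, not a failure.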
30582 1726855363.75408: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855363.75751: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d35 30582 1726855363.75755: variable 'ansible_search_path' from source: unknown 30582 1726855363.75757: variable 'ansible_search_path' from source: unknown 30582 1726855363.75760: calling self._execute() 30582 1726855363.75945: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.76001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.76012: variable 'omit' from source: magic vars 30582 1726855363.76855: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.76870: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.77075: variable 'network_state' from source: role '' defaults 30582 1726855363.77086: Evaluated conditional (network_state != {}): False 30582 1726855363.77091: when evaluation is False, skipping this task 30582 1726855363.77094: _execute() done 30582 1726855363.77097: dumping result to json 30582 1726855363.77100: done dumping result, returning 30582 1726855363.77111: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000001d35] 30582 1726855363.77114: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d35 30582 1726855363.77338: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d35 30582 1726855363.77340: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855363.77394: no more pending results, returning what we have 30582 1726855363.77398: results queue empty 30582 1726855363.77399: checking for 
any_errors_fatal 30582 1726855363.77408: done checking for any_errors_fatal 30582 1726855363.77408: checking for max_fail_percentage 30582 1726855363.77410: done checking for max_fail_percentage 30582 1726855363.77411: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.77412: done checking to see if all hosts have failed 30582 1726855363.77413: getting the remaining hosts for this loop 30582 1726855363.77415: done getting the remaining hosts for this loop 30582 1726855363.77419: getting the next task for host managed_node3 30582 1726855363.77427: done getting next task for host managed_node3 30582 1726855363.77431: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855363.77438: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855363.77470: getting variables 30582 1726855363.77472: in VariableManager get_vars() 30582 1726855363.77518: Calling all_inventory to load vars for managed_node3 30582 1726855363.77521: Calling groups_inventory to load vars for managed_node3 30582 1726855363.77523: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.77534: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.77537: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.77540: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.79990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.81778: done with get_vars() 30582 1726855363.81804: done getting variables 30582 1726855363.81867: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:43 -0400 (0:00:00.077) 0:01:40.169 ****** 30582 1726855363.81914: entering _queue_task() for managed_node3/service 30582 1726855363.82681: worker is 1 (out of 1 available) 30582 1726855363.82697: exiting _queue_task() for managed_node3/service 30582 1726855363.82710: done queuing things up, now waiting for results queue to drain 30582 1726855363.82712: waiting for pending results... 
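The next task's conditional, `__network_wireless_connections_defined or __network_team_connections_defined`, is derived from the `network_connections` variable. A hedged sketch of the general idea — scan the connection list for a given type (the helper name, connection data, and logic here are hypothetical illustrations, not the role's actual implementation):

```python
# Illustrative only: derive "is any wireless/team connection defined?" flags
# from an Ansible-style network_connections list.
def any_connection_of_type(network_connections, conn_type):
    """Return True if any connection dict in the list has the given type."""
    return any(c.get("type") == conn_type for c in network_connections)

# Hypothetical example data; the log only shows that `interface` comes
# from play vars, not its value.
connections = [{"name": "example0", "type": "ethernet"}]

wireless = any_connection_of_type(connections, "wireless")
team = any_connection_of_type(connections, "team")
print(wireless or team)  # False -> the restart task is skipped, as logged
```

Since neither flag is set for an ethernet-only connection list, the `or` conditional is False and the "Restart NetworkManager" task is skipped, as the following records show.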
30582 1726855363.83457: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855363.83662: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d36 30582 1726855363.83666: variable 'ansible_search_path' from source: unknown 30582 1726855363.83669: variable 'ansible_search_path' from source: unknown 30582 1726855363.83823: calling self._execute() 30582 1726855363.83936: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.83940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.83950: variable 'omit' from source: magic vars 30582 1726855363.85093: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.85096: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.85104: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855363.85391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855363.89475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855363.89647: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855363.89686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855363.89734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855363.89760: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855363.90027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855363.90073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.90139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.90178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.90246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.90262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.90306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.90344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.90537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.90540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.90543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855363.90545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855363.90548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.90603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855363.90878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855363.91058: variable 'network_connections' from source: include params 30582 1726855363.91073: variable 'interface' from source: play vars 30582 1726855363.91151: variable 'interface' from source: play vars 30582 1726855363.91230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855363.91404: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855363.91492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855363.91498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855363.91504: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855363.91542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855363.91567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855363.91594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855363.91619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855363.91728: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855363.92053: variable 'network_connections' from source: include params 30582 1726855363.92056: variable 'interface' from source: play vars 30582 1726855363.92102: variable 'interface' from source: play vars 30582 1726855363.92133: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855363.92142: when evaluation is False, skipping this task 30582 1726855363.92150: _execute() done 30582 1726855363.92162: dumping result to json 30582 1726855363.92379: done dumping result, returning 30582 1726855363.92382: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000001d36] 30582 1726855363.92385: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d36 30582 1726855363.92463: done sending task result for task 
0affcc66-ac2b-aa83-7d57-000000001d36 30582 1726855363.92473: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855363.92639: no more pending results, returning what we have 30582 1726855363.92643: results queue empty 30582 1726855363.92644: checking for any_errors_fatal 30582 1726855363.92653: done checking for any_errors_fatal 30582 1726855363.92654: checking for max_fail_percentage 30582 1726855363.92657: done checking for max_fail_percentage 30582 1726855363.92658: checking to see if all hosts have failed and the running result is not ok 30582 1726855363.92659: done checking to see if all hosts have failed 30582 1726855363.92660: getting the remaining hosts for this loop 30582 1726855363.92662: done getting the remaining hosts for this loop 30582 1726855363.92666: getting the next task for host managed_node3 30582 1726855363.92675: done getting next task for host managed_node3 30582 1726855363.92681: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855363.92686: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855363.92715: getting variables 30582 1726855363.92717: in VariableManager get_vars() 30582 1726855363.92763: Calling all_inventory to load vars for managed_node3 30582 1726855363.92767: Calling groups_inventory to load vars for managed_node3 30582 1726855363.92770: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855363.92781: Calling all_plugins_play to load vars for managed_node3 30582 1726855363.92785: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855363.92860: Calling groups_plugins_play to load vars for managed_node3 30582 1726855363.94573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855363.96304: done with get_vars() 30582 1726855363.96334: done getting variables 30582 1726855363.96392: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:43 -0400 (0:00:00.145) 0:01:40.314 ****** 30582 1726855363.96426: entering _queue_task() for managed_node3/service 30582 1726855363.96789: worker is 1 (out of 1 available) 30582 1726855363.96803: exiting _queue_task() for managed_node3/service 30582 1726855363.96815: done 
queuing things up, now waiting for results queue to drain 30582 1726855363.96816: waiting for pending results... 30582 1726855363.97206: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855363.97240: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d37 30582 1726855363.97262: variable 'ansible_search_path' from source: unknown 30582 1726855363.97271: variable 'ansible_search_path' from source: unknown 30582 1726855363.97321: calling self._execute() 30582 1726855363.97438: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855363.97450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855363.97466: variable 'omit' from source: magic vars 30582 1726855363.97873: variable 'ansible_distribution_major_version' from source: facts 30582 1726855363.97894: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855363.98065: variable 'network_provider' from source: set_fact 30582 1726855363.98078: variable 'network_state' from source: role '' defaults 30582 1726855363.98096: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855363.98108: variable 'omit' from source: magic vars 30582 1726855363.98185: variable 'omit' from source: magic vars 30582 1726855363.98282: variable 'network_service_name' from source: role '' defaults 30582 1726855363.98297: variable 'network_service_name' from source: role '' defaults 30582 1726855363.98407: variable '__network_provider_setup' from source: role '' defaults 30582 1726855363.98417: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855363.98481: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855363.98501: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855363.98564: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855363.98800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855364.00981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855364.01395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855364.01399: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855364.01401: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855364.01403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855364.01628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.01665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.01698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.01756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.01845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.01941: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.01968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.02070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.02114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.02168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.02634: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855364.02924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.02954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.03045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.03093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.03143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.03332: variable 'ansible_python' from source: facts 30582 1726855364.03559: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855364.03562: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855364.03747: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855364.04184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.04189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.04192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.04194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.04196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.04341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.04418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.04441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.04530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.04547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.04995: variable 'network_connections' from source: include params 30582 1726855364.04998: variable 'interface' from source: play vars 30582 1726855364.05001: variable 'interface' from source: play vars 30582 1726855364.05166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855364.05361: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855364.05422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855364.05475: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855364.05528: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855364.06040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855364.06077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855364.06124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.06164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855364.06224: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855364.06527: variable 'network_connections' from source: include params 30582 1726855364.06539: variable 'interface' from source: play vars 30582 1726855364.06616: variable 'interface' from source: play vars 30582 1726855364.06660: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855364.06749: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855364.07054: variable 'network_connections' from source: include params 30582 1726855364.07070: variable 'interface' from source: play vars 30582 1726855364.07138: variable 'interface' from source: play vars 30582 1726855364.07161: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855364.07240: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855364.07522: variable 'network_connections' from source: include params 30582 1726855364.07532: variable 'interface' from source: play vars 30582 1726855364.07608: variable 'interface' from source: play vars 30582 1726855364.07663: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855364.07795: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855364.07798: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855364.07802: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855364.08017: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855364.08514: variable 'network_connections' from source: include params 30582 1726855364.08525: variable 'interface' from source: play vars 30582 1726855364.08594: variable 'interface' from source: play vars 30582 1726855364.08607: variable 'ansible_distribution' from source: facts 30582 1726855364.08615: variable '__network_rh_distros' from source: role '' defaults 30582 1726855364.08627: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.08646: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855364.08830: variable 'ansible_distribution' from source: facts 30582 1726855364.08844: variable '__network_rh_distros' from source: role '' defaults 30582 1726855364.08914: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.08917: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855364.09054: variable 'ansible_distribution' from source: facts 30582 1726855364.09064: variable '__network_rh_distros' from source: role '' defaults 30582 1726855364.09075: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.09117: variable 'network_provider' from source: set_fact 30582 1726855364.09151: variable 'omit' from source: magic vars 30582 1726855364.09185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855364.09220: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855364.09249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855364.09348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855364.09351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855364.09354: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855364.09356: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.09358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.09443: Set connection var ansible_timeout to 10 30582 1726855364.09455: Set connection var ansible_connection to ssh 30582 1726855364.09473: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855364.09483: Set connection var ansible_pipelining to False 30582 1726855364.09495: Set connection var ansible_shell_executable to /bin/sh 30582 1726855364.09503: Set connection var ansible_shell_type to sh 30582 1726855364.09531: variable 'ansible_shell_executable' from source: unknown 30582 1726855364.09539: variable 'ansible_connection' from source: unknown 30582 1726855364.09547: variable 'ansible_module_compression' from source: unknown 30582 1726855364.09554: variable 'ansible_shell_type' from source: unknown 30582 1726855364.09561: variable 'ansible_shell_executable' from source: unknown 30582 1726855364.09572: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.09581: variable 'ansible_pipelining' from source: unknown 30582 1726855364.09680: variable 'ansible_timeout' from source: unknown 30582 1726855364.09683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855364.09714: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855364.09735: variable 'omit' from source: magic vars 30582 1726855364.09744: starting attempt loop 30582 1726855364.09749: running the handler 30582 1726855364.09823: variable 'ansible_facts' from source: unknown 30582 1726855364.10563: _low_level_execute_command(): starting 30582 1726855364.10576: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855364.11277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855364.11316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855364.11329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855364.11427: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.11450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
30582 1726855364.11469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.11575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.13322: stdout chunk (state=3): >>>/root <<< 30582 1726855364.13455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.13485: stdout chunk (state=3): >>><<< 30582 1726855364.13500: stderr chunk (state=3): >>><<< 30582 1726855364.13524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855364.13543: _low_level_execute_command(): starting 30582 1726855364.13629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447 `" && echo ansible-tmp-1726855364.1353185-35252-167639349183447="` echo /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447 `" ) && sleep 0' 30582 1726855364.14230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855364.14236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855364.14322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855364.14326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855364.14329: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.14331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.14354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.14359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855364.14380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.14468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.16481: stdout chunk (state=3): 
>>>ansible-tmp-1726855364.1353185-35252-167639349183447=/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447 <<< 30582 1726855364.16622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.16625: stdout chunk (state=3): >>><<< 30582 1726855364.16628: stderr chunk (state=3): >>><<< 30582 1726855364.16792: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855364.1353185-35252-167639349183447=/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855364.16796: variable 'ansible_module_compression' from source: unknown 30582 1726855364.16798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855364.16813: variable 'ansible_facts' 
from source: unknown 30582 1726855364.17036: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py 30582 1726855364.17266: Sending initial data 30582 1726855364.17270: Sent initial data (156 bytes) 30582 1726855364.18030: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855364.18107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.18161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.18183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855364.18217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.18340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.20032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855364.20109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855364.20175: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpde49i3dj /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py <<< 30582 1726855364.20179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py" <<< 30582 1726855364.20295: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpde49i3dj" to remote "/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py" <<< 30582 1726855364.22186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.22234: stderr chunk (state=3): >>><<< 30582 1726855364.22267: stdout chunk (state=3): >>><<< 30582 1726855364.22393: done transferring module to remote 30582 1726855364.22397: _low_level_execute_command(): starting 30582 1726855364.22399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/ 
/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py && sleep 0' 30582 1726855364.23265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855364.23268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855364.23270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.23272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855364.23274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855364.23276: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855364.23328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855364.23379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.23427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.25354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.25368: stdout chunk (state=3): >>><<< 30582 1726855364.25390: stderr chunk (state=3): >>><<< 30582 1726855364.25489: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855364.25493: _low_level_execute_command(): starting 30582 1726855364.25496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/AnsiballZ_systemd.py && sleep 0' 30582 1726855364.26140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855364.26155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855364.26172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.26198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855364.26216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 <<< 30582 1726855364.26353: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.26471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855364.26496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.26596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.56363: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", 
"ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321262080", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2245043000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not 
set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855364.56406: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", 
"ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": 
"594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855364.58517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855364.58554: stdout chunk (state=3): >>><<< 30582 1726855364.58569: stderr chunk (state=3): >>><<< 30582 1726855364.58796: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10670080", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321262080", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2245043000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855364.59040: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855364.59068: _low_level_execute_command(): starting 30582 1726855364.59265: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855364.1353185-35252-167639349183447/ > /dev/null 2>&1 && sleep 0' 30582 1726855364.60253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855364.60266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.60278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855364.60292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.60345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.60361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855364.60384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.60486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.62566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.62577: stdout chunk (state=3): >>><<< 30582 1726855364.62591: stderr chunk (state=3): >>><<< 30582 1726855364.62610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855364.62902: handler run complete 30582 1726855364.62905: attempt loop complete, returning result 30582 1726855364.62907: _execute() done 30582 1726855364.62909: dumping result to json 30582 1726855364.62911: done dumping result, returning 30582 1726855364.62912: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000001d37] 30582 1726855364.62914: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d37 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855364.63152: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d37 30582 1726855364.63178: no more pending results, returning what we have 30582 1726855364.63183: results queue empty 30582 1726855364.63184: checking for any_errors_fatal 30582 1726855364.63196: done checking for any_errors_fatal 30582 1726855364.63197: checking for max_fail_percentage 30582 1726855364.63200: done checking for max_fail_percentage 30582 1726855364.63201: checking to see if all hosts have failed and the running result is not ok 30582 1726855364.63202: done checking to see if all hosts have failed 30582 1726855364.63203: getting the remaining hosts for this loop 30582 1726855364.63204: done getting the remaining hosts for this loop 30582 1726855364.63209: getting the next task for host managed_node3 30582 1726855364.63217: done getting next task for host managed_node3 30582 1726855364.63221: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855364.63227: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855364.63243: getting variables 30582 1726855364.63245: in VariableManager get_vars() 30582 1726855364.63444: Calling all_inventory to load vars for managed_node3 30582 1726855364.63447: Calling groups_inventory to load vars for managed_node3 30582 1726855364.63451: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855364.63467: Calling all_plugins_play to load vars for managed_node3 30582 1726855364.63471: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855364.63475: Calling groups_plugins_play to load vars for managed_node3 30582 1726855364.64001: WORKER PROCESS EXITING 30582 1726855364.65378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855364.67090: done with get_vars() 30582 1726855364.67119: done getting variables 30582 1726855364.67178: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:44 -0400 (0:00:00.707) 0:01:41.022 ****** 30582 1726855364.67226: entering _queue_task() for managed_node3/service 30582 1726855364.67638: worker is 1 (out of 1 available) 30582 1726855364.67658: exiting _queue_task() for managed_node3/service 30582 1726855364.67670: done queuing things up, now waiting for results queue to drain 30582 1726855364.67672: waiting for pending results... 30582 1726855364.67905: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855364.68095: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d38 30582 1726855364.68099: variable 'ansible_search_path' from source: unknown 30582 1726855364.68102: variable 'ansible_search_path' from source: unknown 30582 1726855364.68105: calling self._execute() 30582 1726855364.68165: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.68174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.68220: variable 'omit' from source: magic vars 30582 1726855364.68558: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.68573: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855364.68691: variable 'network_provider' from source: set_fact 30582 1726855364.68698: Evaluated conditional (network_provider == "nm"): True 30582 1726855364.68993: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855364.68996: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30582 1726855364.69045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855364.71082: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855364.71148: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855364.71194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855364.71222: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855364.71248: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855364.71340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.71371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.71400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.71440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.71451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.71502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.71523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.71541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.71565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.71578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.71611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.71626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.71643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.71676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855364.71686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.71791: variable 'network_connections' from source: include params 30582 1726855364.71800: variable 'interface' from source: play vars 30582 1726855364.71848: variable 'interface' from source: play vars 30582 1726855364.71902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855364.72013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855364.72040: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855364.72061: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855364.72088: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855364.72118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855364.72133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855364.72151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.72171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855364.72210: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30582 1726855364.72357: variable 'network_connections' from source: include params 30582 1726855364.72361: variable 'interface' from source: play vars 30582 1726855364.72408: variable 'interface' from source: play vars 30582 1726855364.72430: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855364.72433: when evaluation is False, skipping this task 30582 1726855364.72436: _execute() done 30582 1726855364.72438: dumping result to json 30582 1726855364.72440: done dumping result, returning 30582 1726855364.72448: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000001d38] 30582 1726855364.72459: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d38 30582 1726855364.72545: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d38 30582 1726855364.72548: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855364.72597: no more pending results, returning what we have 30582 1726855364.72600: results queue empty 30582 1726855364.72601: checking for any_errors_fatal 30582 1726855364.72617: done checking for any_errors_fatal 30582 1726855364.72617: checking for max_fail_percentage 30582 1726855364.72619: done checking for max_fail_percentage 30582 1726855364.72620: checking to see if all hosts have failed and the running result is not ok 30582 1726855364.72621: done checking to see if all hosts have failed 30582 1726855364.72622: getting the remaining hosts for this loop 30582 1726855364.72623: done getting the remaining hosts for this loop 30582 1726855364.72627: getting the next task for host managed_node3 30582 1726855364.72635: done getting next task for host managed_node3 30582 1726855364.72639: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30582 1726855364.72643: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855364.72674: getting variables 30582 1726855364.72676: in VariableManager get_vars() 30582 1726855364.72720: Calling all_inventory to load vars for managed_node3 30582 1726855364.72723: Calling groups_inventory to load vars for managed_node3 30582 1726855364.72725: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855364.72734: Calling all_plugins_play to load vars for managed_node3 30582 1726855364.72736: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855364.72739: Calling groups_plugins_play to load vars for managed_node3 30582 1726855364.73805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855364.74933: done with get_vars() 30582 1726855364.74951: done getting variables 30582 1726855364.74996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:44 -0400 (0:00:00.077) 0:01:41.100 ****** 30582 1726855364.75020: entering _queue_task() for managed_node3/service 30582 1726855364.75269: worker is 1 (out of 1 available) 30582 1726855364.75282: exiting _queue_task() for managed_node3/service 30582 1726855364.75296: done queuing things up, now waiting for results queue to drain 30582 1726855364.75297: waiting for pending results... 
30582 1726855364.75513: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855364.75718: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d39 30582 1726855364.75723: variable 'ansible_search_path' from source: unknown 30582 1726855364.75725: variable 'ansible_search_path' from source: unknown 30582 1726855364.75729: calling self._execute() 30582 1726855364.75835: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.75853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.75868: variable 'omit' from source: magic vars 30582 1726855364.76278: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.76296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855364.76480: variable 'network_provider' from source: set_fact 30582 1726855364.76484: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855364.76486: when evaluation is False, skipping this task 30582 1726855364.76492: _execute() done 30582 1726855364.76494: dumping result to json 30582 1726855364.76497: done dumping result, returning 30582 1726855364.76501: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000001d39] 30582 1726855364.76503: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d39 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855364.76749: no more pending results, returning what we have 30582 1726855364.76753: results queue empty 30582 1726855364.76754: checking for any_errors_fatal 30582 1726855364.76764: done checking for any_errors_fatal 30582 1726855364.76764: checking for max_fail_percentage 30582 1726855364.76767: done checking for max_fail_percentage 30582 
1726855364.76768: checking to see if all hosts have failed and the running result is not ok 30582 1726855364.76769: done checking to see if all hosts have failed 30582 1726855364.76769: getting the remaining hosts for this loop 30582 1726855364.76771: done getting the remaining hosts for this loop 30582 1726855364.76775: getting the next task for host managed_node3 30582 1726855364.76784: done getting next task for host managed_node3 30582 1726855364.76792: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855364.76996: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855364.77018: getting variables 30582 1726855364.77021: in VariableManager get_vars() 30582 1726855364.77059: Calling all_inventory to load vars for managed_node3 30582 1726855364.77062: Calling groups_inventory to load vars for managed_node3 30582 1726855364.77064: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855364.77073: Calling all_plugins_play to load vars for managed_node3 30582 1726855364.77076: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855364.77080: Calling groups_plugins_play to load vars for managed_node3 30582 1726855364.77602: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d39 30582 1726855364.77606: WORKER PROCESS EXITING 30582 1726855364.78400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855364.79391: done with get_vars() 30582 1726855364.79413: done getting variables 30582 1726855364.79480: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:44 -0400 (0:00:00.044) 0:01:41.145 ****** 30582 1726855364.79523: entering _queue_task() for managed_node3/copy 30582 1726855364.80023: worker is 1 (out of 1 available) 30582 1726855364.80033: exiting _queue_task() for managed_node3/copy 30582 1726855364.80043: done queuing things up, now waiting for results queue to drain 30582 1726855364.80045: waiting for pending results... 
30582 1726855364.80169: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855364.80315: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3a 30582 1726855364.80335: variable 'ansible_search_path' from source: unknown 30582 1726855364.80349: variable 'ansible_search_path' from source: unknown 30582 1726855364.80395: calling self._execute() 30582 1726855364.80504: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.80515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.80544: variable 'omit' from source: magic vars 30582 1726855364.80954: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.80957: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855364.80973: variable 'network_provider' from source: set_fact 30582 1726855364.80980: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855364.80983: when evaluation is False, skipping this task 30582 1726855364.80985: _execute() done 30582 1726855364.80989: dumping result to json 30582 1726855364.80992: done dumping result, returning 30582 1726855364.81001: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000001d3a] 30582 1726855364.81004: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3a skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855364.81170: no more pending results, returning what we have 30582 1726855364.81174: results queue empty 30582 1726855364.81176: checking for any_errors_fatal 30582 1726855364.81182: done checking for any_errors_fatal 30582 1726855364.81182: checking for max_fail_percentage 30582 
1726855364.81184: done checking for max_fail_percentage 30582 1726855364.81185: checking to see if all hosts have failed and the running result is not ok 30582 1726855364.81185: done checking to see if all hosts have failed 30582 1726855364.81186: getting the remaining hosts for this loop 30582 1726855364.81189: done getting the remaining hosts for this loop 30582 1726855364.81193: getting the next task for host managed_node3 30582 1726855364.81199: done getting next task for host managed_node3 30582 1726855364.81203: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855364.81207: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855364.81227: getting variables 30582 1726855364.81229: in VariableManager get_vars() 30582 1726855364.81262: Calling all_inventory to load vars for managed_node3 30582 1726855364.81264: Calling groups_inventory to load vars for managed_node3 30582 1726855364.81266: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855364.81274: Calling all_plugins_play to load vars for managed_node3 30582 1726855364.81276: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855364.81279: Calling groups_plugins_play to load vars for managed_node3 30582 1726855364.81803: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3a 30582 1726855364.81807: WORKER PROCESS EXITING 30582 1726855364.82438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855364.83426: done with get_vars() 30582 1726855364.83441: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:44 -0400 (0:00:00.039) 0:01:41.184 ****** 30582 1726855364.83507: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855364.83730: worker is 1 (out of 1 available) 30582 1726855364.83744: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855364.83755: done queuing things up, now waiting for results queue to drain 30582 1726855364.83757: waiting for pending results... 
30582 1726855364.83940: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855364.84070: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3b 30582 1726855364.84074: variable 'ansible_search_path' from source: unknown 30582 1726855364.84076: variable 'ansible_search_path' from source: unknown 30582 1726855364.84110: calling self._execute() 30582 1726855364.84411: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.84415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.84418: variable 'omit' from source: magic vars 30582 1726855364.84600: variable 'ansible_distribution_major_version' from source: facts 30582 1726855364.84612: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855364.84621: variable 'omit' from source: magic vars 30582 1726855364.84684: variable 'omit' from source: magic vars 30582 1726855364.84834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855364.91598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855364.91644: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855364.91672: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855364.91697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855364.91716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855364.91770: variable 'network_provider' from source: set_fact 30582 1726855364.91856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855364.91878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855364.91900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855364.91925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855364.91935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855364.91991: variable 'omit' from source: magic vars 30582 1726855364.92061: variable 'omit' from source: magic vars 30582 1726855364.92134: variable 'network_connections' from source: include params 30582 1726855364.92143: variable 'interface' from source: play vars 30582 1726855364.92186: variable 'interface' from source: play vars 30582 1726855364.92278: variable 'omit' from source: magic vars 30582 1726855364.92284: variable '__lsr_ansible_managed' from source: task vars 30582 1726855364.92328: variable '__lsr_ansible_managed' from source: task vars 30582 1726855364.92441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855364.92565: Loaded config def from plugin (lookup/template) 30582 1726855364.92571: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855364.92591: File lookup term: get_ansible_managed.j2 30582 1726855364.92594: variable 
'ansible_search_path' from source: unknown 30582 1726855364.92597: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855364.92608: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855364.92620: variable 'ansible_search_path' from source: unknown 30582 1726855364.95721: variable 'ansible_managed' from source: unknown 30582 1726855364.95795: variable 'omit' from source: magic vars 30582 1726855364.95814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855364.95829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855364.95839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855364.95850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855364.95857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855364.95875: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855364.95878: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.95880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.95941: Set connection var ansible_timeout to 10 30582 1726855364.95944: Set connection var ansible_connection to ssh 30582 1726855364.95949: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855364.95954: Set connection var ansible_pipelining to False 30582 1726855364.95959: Set connection var ansible_shell_executable to /bin/sh 30582 1726855364.95961: Set connection var ansible_shell_type to sh 30582 1726855364.95980: variable 'ansible_shell_executable' from source: unknown 30582 1726855364.95982: variable 'ansible_connection' from source: unknown 30582 1726855364.95985: variable 'ansible_module_compression' from source: unknown 30582 1726855364.95989: variable 'ansible_shell_type' from source: unknown 30582 1726855364.95992: variable 'ansible_shell_executable' from source: unknown 30582 1726855364.95996: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855364.95999: variable 'ansible_pipelining' from source: unknown 30582 1726855364.96001: variable 'ansible_timeout' from source: unknown 30582 1726855364.96003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855364.96079: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855364.96092: variable 'omit' from 
source: magic vars 30582 1726855364.96095: starting attempt loop 30582 1726855364.96098: running the handler 30582 1726855364.96105: _low_level_execute_command(): starting 30582 1726855364.96110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855364.96574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.96601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.96605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.96644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.96659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.96739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855364.98496: stdout chunk (state=3): >>>/root <<< 30582 1726855364.98601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855364.98631: stderr chunk (state=3): >>><<< 30582 1726855364.98635: stdout chunk (state=3): >>><<< 30582 
1726855364.98651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855364.98662: _low_level_execute_command(): starting 30582 1726855364.98668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706 `" && echo ansible-tmp-1726855364.9865177-35332-155618300653706="` echo /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706 `" ) && sleep 0' 30582 1726855364.99096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855364.99122: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855364.99126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855364.99129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855364.99131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855364.99181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855364.99184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855364.99251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.01205: stdout chunk (state=3): >>>ansible-tmp-1726855364.9865177-35332-155618300653706=/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706 <<< 30582 1726855365.01316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.01340: stderr chunk (state=3): >>><<< 30582 1726855365.01343: stdout chunk (state=3): >>><<< 30582 1726855365.01359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855364.9865177-35332-155618300653706=/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.01396: variable 'ansible_module_compression' from source: unknown 30582 1726855365.01426: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855365.01447: variable 'ansible_facts' from source: unknown 30582 1726855365.01515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py 30582 1726855365.01611: Sending initial data 30582 1726855365.01614: Sent initial data (168 bytes) 30582 1726855365.02033: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.02038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855365.02045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.02047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.02049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.02093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.02097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.02167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.03790: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855365.03795: stderr chunk (state=3): >>>debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855365.03847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855365.03907: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmphfcjobcj /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py <<< 30582 1726855365.03912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py" <<< 30582 1726855365.03964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmphfcjobcj" to remote "/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py" <<< 30582 1726855365.03969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py" <<< 30582 1726855365.04832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.04841: stderr chunk (state=3): >>><<< 30582 1726855365.04844: stdout chunk (state=3): >>><<< 30582 1726855365.04871: done transferring module to remote 30582 1726855365.04879: _low_level_execute_command(): starting 30582 1726855365.04884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/ /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py && sleep 0' 30582 1726855365.05493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855365.05498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.05500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.05502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.05576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.05637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.07486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.07513: stderr chunk (state=3): >>><<< 30582 1726855365.07516: stdout chunk (state=3): >>><<< 30582 1726855365.07528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.07531: _low_level_execute_command(): starting 30582 1726855365.07535: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/AnsiballZ_network_connections.py && sleep 0' 30582 1726855365.07952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855365.07959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855365.07985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.07990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855365.07993: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.07995: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.08043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.08051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.08116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.41207: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855365.43198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855365.43203: stdout chunk (state=3): >>><<< 30582 1726855365.43210: stderr chunk (state=3): >>><<< 30582 1726855365.43229: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855365.43268: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855365.43280: _low_level_execute_command(): starting 30582 1726855365.43285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855364.9865177-35332-155618300653706/ > /dev/null 2>&1 && sleep 0' 30582 1726855365.43922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855365.43931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855365.43993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855365.43996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855365.44000: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.44067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.44084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855365.44106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.44204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.46198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.46202: stdout chunk (state=3): >>><<< 30582 1726855365.46204: stderr chunk (state=3): >>><<< 30582 1726855365.46207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.46209: handler run complete 30582 1726855365.46211: attempt loop complete, returning result 30582 1726855365.46213: _execute() done 30582 1726855365.46215: dumping result to json 30582 1726855365.46216: done dumping result, returning 30582 1726855365.46218: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000001d3b] 30582 1726855365.46220: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3b changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30582 1726855365.46440: no more pending results, returning what we have 30582 1726855365.46444: results queue empty 30582 1726855365.46445: checking for any_errors_fatal 30582 1726855365.46452: done checking for any_errors_fatal 30582 1726855365.46453: checking for max_fail_percentage 30582 1726855365.46455: done checking for max_fail_percentage 30582 1726855365.46456: checking to see if all hosts have failed and the running result is not ok 30582 1726855365.46457: done checking to see if all hosts have failed 30582 1726855365.46458: getting the 
remaining hosts for this loop 30582 1726855365.46459: done getting the remaining hosts for this loop 30582 1726855365.46462: getting the next task for host managed_node3 30582 1726855365.46470: done getting next task for host managed_node3 30582 1726855365.46474: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855365.46479: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855365.46713: getting variables 30582 1726855365.46716: in VariableManager get_vars() 30582 1726855365.46756: Calling all_inventory to load vars for managed_node3 30582 1726855365.46759: Calling groups_inventory to load vars for managed_node3 30582 1726855365.46762: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855365.46770: Calling all_plugins_play to load vars for managed_node3 30582 1726855365.46773: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855365.46776: Calling groups_plugins_play to load vars for managed_node3 30582 1726855365.47383: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3b 30582 1726855365.47386: WORKER PROCESS EXITING 30582 1726855365.54226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855365.55912: done with get_vars() 30582 1726855365.55952: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:45 -0400 (0:00:00.725) 0:01:41.910 ****** 30582 1726855365.56042: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855365.56494: worker is 1 (out of 1 available) 30582 1726855365.56508: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855365.56521: done queuing things up, now waiting for results queue to drain 30582 1726855365.56522: waiting for pending results... 
30582 1726855365.56804: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855365.56984: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3c 30582 1726855365.57009: variable 'ansible_search_path' from source: unknown 30582 1726855365.57018: variable 'ansible_search_path' from source: unknown 30582 1726855365.57065: calling self._execute() 30582 1726855365.57163: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.57176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.57191: variable 'omit' from source: magic vars 30582 1726855365.57579: variable 'ansible_distribution_major_version' from source: facts 30582 1726855365.57685: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855365.57718: variable 'network_state' from source: role '' defaults 30582 1726855365.57736: Evaluated conditional (network_state != {}): False 30582 1726855365.57743: when evaluation is False, skipping this task 30582 1726855365.57750: _execute() done 30582 1726855365.57757: dumping result to json 30582 1726855365.57764: done dumping result, returning 30582 1726855365.57776: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000001d3c] 30582 1726855365.57790: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855365.57950: no more pending results, returning what we have 30582 1726855365.57954: results queue empty 30582 1726855365.57955: checking for any_errors_fatal 30582 1726855365.57973: done checking for any_errors_fatal 30582 1726855365.57974: checking for max_fail_percentage 30582 1726855365.57976: done checking for max_fail_percentage 30582 1726855365.57977: 
checking to see if all hosts have failed and the running result is not ok 30582 1726855365.57978: done checking to see if all hosts have failed 30582 1726855365.57978: getting the remaining hosts for this loop 30582 1726855365.57980: done getting the remaining hosts for this loop 30582 1726855365.57984: getting the next task for host managed_node3 30582 1726855365.57993: done getting next task for host managed_node3 30582 1726855365.57997: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855365.58004: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855365.58039: getting variables 30582 1726855365.58041: in VariableManager get_vars() 30582 1726855365.58085: Calling all_inventory to load vars for managed_node3 30582 1726855365.58275: Calling groups_inventory to load vars for managed_node3 30582 1726855365.58279: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855365.58285: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3c 30582 1726855365.58291: WORKER PROCESS EXITING 30582 1726855365.58300: Calling all_plugins_play to load vars for managed_node3 30582 1726855365.58303: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855365.58307: Calling groups_plugins_play to load vars for managed_node3 30582 1726855365.59766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855365.61408: done with get_vars() 30582 1726855365.61434: done getting variables 30582 1726855365.61508: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:45 -0400 (0:00:00.055) 0:01:41.965 ****** 30582 1726855365.61546: entering _queue_task() for managed_node3/debug 30582 1726855365.61994: worker is 1 (out of 1 available) 30582 1726855365.62009: exiting _queue_task() for managed_node3/debug 30582 1726855365.62021: done queuing things up, now waiting for results queue to drain 30582 1726855365.62022: waiting for pending results... 
30582 1726855365.62410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855365.62445: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3d 30582 1726855365.62465: variable 'ansible_search_path' from source: unknown 30582 1726855365.62473: variable 'ansible_search_path' from source: unknown 30582 1726855365.62523: calling self._execute() 30582 1726855365.62634: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.62647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.62662: variable 'omit' from source: magic vars 30582 1726855365.63071: variable 'ansible_distribution_major_version' from source: facts 30582 1726855365.63090: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855365.63101: variable 'omit' from source: magic vars 30582 1726855365.63175: variable 'omit' from source: magic vars 30582 1726855365.63216: variable 'omit' from source: magic vars 30582 1726855365.63257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855365.63304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855365.63328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855365.63348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.63363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.63405: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855365.63414: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.63486: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855365.63537: Set connection var ansible_timeout to 10 30582 1726855365.63544: Set connection var ansible_connection to ssh 30582 1726855365.63556: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855365.63564: Set connection var ansible_pipelining to False 30582 1726855365.63573: Set connection var ansible_shell_executable to /bin/sh 30582 1726855365.63580: Set connection var ansible_shell_type to sh 30582 1726855365.63613: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.63621: variable 'ansible_connection' from source: unknown 30582 1726855365.63628: variable 'ansible_module_compression' from source: unknown 30582 1726855365.63635: variable 'ansible_shell_type' from source: unknown 30582 1726855365.63641: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.63647: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.63654: variable 'ansible_pipelining' from source: unknown 30582 1726855365.63660: variable 'ansible_timeout' from source: unknown 30582 1726855365.63703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.63819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855365.63837: variable 'omit' from source: magic vars 30582 1726855365.63847: starting attempt loop 30582 1726855365.63854: running the handler 30582 1726855365.63998: variable '__network_connections_result' from source: set_fact 30582 1726855365.64137: handler run complete 30582 1726855365.64140: attempt loop complete, returning result 30582 1726855365.64142: _execute() done 30582 1726855365.64145: dumping result to json 30582 1726855365.64147: 
done dumping result, returning 30582 1726855365.64149: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001d3d] 30582 1726855365.64152: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3d 30582 1726855365.64222: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3d 30582 1726855365.64225: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30582 1726855365.64314: no more pending results, returning what we have 30582 1726855365.64318: results queue empty 30582 1726855365.64320: checking for any_errors_fatal 30582 1726855365.64327: done checking for any_errors_fatal 30582 1726855365.64327: checking for max_fail_percentage 30582 1726855365.64330: done checking for max_fail_percentage 30582 1726855365.64331: checking to see if all hosts have failed and the running result is not ok 30582 1726855365.64331: done checking to see if all hosts have failed 30582 1726855365.64332: getting the remaining hosts for this loop 30582 1726855365.64334: done getting the remaining hosts for this loop 30582 1726855365.64338: getting the next task for host managed_node3 30582 1726855365.64346: done getting next task for host managed_node3 30582 1726855365.64350: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855365.64356: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855365.64368: getting variables 30582 1726855365.64369: in VariableManager get_vars() 30582 1726855365.64415: Calling all_inventory to load vars for managed_node3 30582 1726855365.64418: Calling groups_inventory to load vars for managed_node3 30582 1726855365.64420: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855365.64431: Calling all_plugins_play to load vars for managed_node3 30582 1726855365.64434: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855365.64437: Calling groups_plugins_play to load vars for managed_node3 30582 1726855365.66130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855365.67777: done with get_vars() 30582 1726855365.67804: done getting variables 30582 1726855365.67868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:45 -0400 (0:00:00.063) 0:01:42.029 ****** 30582 1726855365.67916: entering _queue_task() for managed_node3/debug 30582 1726855365.68267: worker is 1 (out of 1 available) 30582 1726855365.68393: exiting _queue_task() for managed_node3/debug 30582 1726855365.68404: done queuing things up, now waiting for results queue to drain 30582 1726855365.68406: waiting for pending results... 30582 1726855365.68642: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855365.68820: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3e 30582 1726855365.68824: variable 'ansible_search_path' from source: unknown 30582 1726855365.68827: variable 'ansible_search_path' from source: unknown 30582 1726855365.68830: calling self._execute() 30582 1726855365.68933: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.68945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.68962: variable 'omit' from source: magic vars 30582 1726855365.69362: variable 'ansible_distribution_major_version' from source: facts 30582 1726855365.69471: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855365.69475: variable 'omit' from source: magic vars 30582 1726855365.69478: variable 'omit' from source: magic vars 30582 1726855365.69517: variable 'omit' from source: magic vars 30582 1726855365.69561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855365.69609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855365.69632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855365.69654: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.69669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.69712: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855365.69720: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.69727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.69836: Set connection var ansible_timeout to 10 30582 1726855365.69843: Set connection var ansible_connection to ssh 30582 1726855365.69855: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855365.69863: Set connection var ansible_pipelining to False 30582 1726855365.69871: Set connection var ansible_shell_executable to /bin/sh 30582 1726855365.69876: Set connection var ansible_shell_type to sh 30582 1726855365.69909: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.69917: variable 'ansible_connection' from source: unknown 30582 1726855365.70011: variable 'ansible_module_compression' from source: unknown 30582 1726855365.70015: variable 'ansible_shell_type' from source: unknown 30582 1726855365.70017: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.70019: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.70021: variable 'ansible_pipelining' from source: unknown 30582 1726855365.70023: variable 'ansible_timeout' from source: unknown 30582 1726855365.70025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.70106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855365.70128: variable 'omit' from source: magic vars 30582 1726855365.70142: starting attempt loop 30582 1726855365.70149: running the handler 30582 1726855365.70201: variable '__network_connections_result' from source: set_fact 30582 1726855365.70290: variable '__network_connections_result' from source: set_fact 30582 1726855365.70407: handler run complete 30582 1726855365.70434: attempt loop complete, returning result 30582 1726855365.70447: _execute() done 30582 1726855365.70454: dumping result to json 30582 1726855365.70464: done dumping result, returning 30582 1726855365.70477: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-000000001d3e] 30582 1726855365.70488: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3e 30582 1726855365.70625: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3e 30582 1726855365.70629: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30582 1726855365.70838: no more pending results, returning what we have 30582 1726855365.70841: results queue empty 30582 1726855365.70843: checking for any_errors_fatal 30582 1726855365.70851: 
done checking for any_errors_fatal 30582 1726855365.70852: checking for max_fail_percentage 30582 1726855365.70854: done checking for max_fail_percentage 30582 1726855365.70855: checking to see if all hosts have failed and the running result is not ok 30582 1726855365.70856: done checking to see if all hosts have failed 30582 1726855365.70857: getting the remaining hosts for this loop 30582 1726855365.70858: done getting the remaining hosts for this loop 30582 1726855365.70862: getting the next task for host managed_node3 30582 1726855365.70870: done getting next task for host managed_node3 30582 1726855365.70873: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855365.70878: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855365.70894: getting variables 30582 1726855365.70896: in VariableManager get_vars() 30582 1726855365.70934: Calling all_inventory to load vars for managed_node3 30582 1726855365.70937: Calling groups_inventory to load vars for managed_node3 30582 1726855365.70939: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855365.70952: Calling all_plugins_play to load vars for managed_node3 30582 1726855365.70955: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855365.70957: Calling groups_plugins_play to load vars for managed_node3 30582 1726855365.72774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855365.74437: done with get_vars() 30582 1726855365.74462: done getting variables 30582 1726855365.74530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:45 -0400 (0:00:00.066) 0:01:42.095 ****** 30582 1726855365.74571: entering _queue_task() for managed_node3/debug 30582 1726855365.74945: worker is 1 (out of 1 available) 30582 1726855365.75195: exiting _queue_task() for managed_node3/debug 30582 1726855365.75206: done queuing things up, now waiting for results queue to drain 30582 1726855365.75208: waiting for pending results... 
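The `ok: [managed_node3]` result printed above carries the full module invocation alongside the module's stdout/stderr. A minimal Python sketch of inspecting such a result payload (the structure is copied from the log output above, abridged to the fields checked; treating it as plain JSON is an assumption about how a consumer might read it):

```python
import json

# Result payload as printed by the debug task above (abridged; structure
# copied from the log output, not from the role's source).
raw = """
{
  "changed": true,
  "failed": false,
  "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\\n",
  "_invocation": {
    "module_args": {
      "provider": "nm",
      "connections": [
        {"name": "statebr", "persistent_state": "absent", "state": "down"}
      ]
    }
  }
}
"""

result = json.loads(raw)

# Note "changed" is true even though nothing matched: the nm provider
# reports the delete *attempt*, and the missing connection only on stderr.
assert result["changed"] and not result["failed"]
assert "no connection matches" in result["stderr"]
print(result["_invocation"]["module_args"]["connections"][0]["name"])  # statebr
```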
30582 1726855365.75407: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855365.75460: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d3f 30582 1726855365.75478: variable 'ansible_search_path' from source: unknown 30582 1726855365.75482: variable 'ansible_search_path' from source: unknown 30582 1726855365.75557: calling self._execute() 30582 1726855365.75623: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.75627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.75636: variable 'omit' from source: magic vars 30582 1726855365.76054: variable 'ansible_distribution_major_version' from source: facts 30582 1726855365.76102: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855365.76210: variable 'network_state' from source: role '' defaults 30582 1726855365.76217: Evaluated conditional (network_state != {}): False 30582 1726855365.76220: when evaluation is False, skipping this task 30582 1726855365.76223: _execute() done 30582 1726855365.76225: dumping result to json 30582 1726855365.76228: done dumping result, returning 30582 1726855365.76237: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-000000001d3f] 30582 1726855365.76293: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3f 30582 1726855365.76368: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d3f 30582 1726855365.76373: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855365.76422: no more pending results, returning what we have 30582 1726855365.76427: results queue empty 30582 1726855365.76428: checking for any_errors_fatal 30582 1726855365.76438: done checking for any_errors_fatal 30582 1726855365.76438: checking for 
max_fail_percentage 30582 1726855365.76441: done checking for max_fail_percentage 30582 1726855365.76442: checking to see if all hosts have failed and the running result is not ok 30582 1726855365.76442: done checking to see if all hosts have failed 30582 1726855365.76443: getting the remaining hosts for this loop 30582 1726855365.76445: done getting the remaining hosts for this loop 30582 1726855365.76449: getting the next task for host managed_node3 30582 1726855365.76456: done getting next task for host managed_node3 30582 1726855365.76460: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855365.76469: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855365.76500: getting variables 30582 1726855365.76502: in VariableManager get_vars() 30582 1726855365.76547: Calling all_inventory to load vars for managed_node3 30582 1726855365.76550: Calling groups_inventory to load vars for managed_node3 30582 1726855365.76552: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855365.76564: Calling all_plugins_play to load vars for managed_node3 30582 1726855365.76569: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855365.76572: Calling groups_plugins_play to load vars for managed_node3 30582 1726855365.78183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855365.79401: done with get_vars() 30582 1726855365.79419: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:45 -0400 (0:00:00.049) 0:01:42.144 ****** 30582 1726855365.79493: entering _queue_task() for managed_node3/ping 30582 1726855365.79742: worker is 1 (out of 1 available) 30582 1726855365.79756: exiting _queue_task() for managed_node3/ping 30582 1726855365.79770: done queuing things up, now waiting for results queue to drain 30582 1726855365.79772: waiting for pending results... 
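The skip recorded above comes from the role default `network_state: {}` failing the task's `when: network_state != {}` test. A hedged sketch of that evaluation (a plain Python comparison standing in for Jinja conditional evaluation, which is an illustration rather than the role's actual machinery):

```python
# Role default, per the log line:
#   variable 'network_state' from source: role '' defaults
network_state = {}

def should_run_debug(network_state):
    """Mirror of the task's `when: network_state != {}` conditional."""
    return network_state != {}

# With the default left untouched the condition is False, producing the
# skipping: ... "false_condition": "network_state != {}" seen above.
assert not should_run_debug(network_state)

# Any non-empty desired state would let the debug task run.
assert should_run_debug({"interfaces": [{"name": "eth0", "state": "up"}]})
```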
30582 1726855365.79962: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855365.80048: in run() - task 0affcc66-ac2b-aa83-7d57-000000001d40 30582 1726855365.80060: variable 'ansible_search_path' from source: unknown 30582 1726855365.80063: variable 'ansible_search_path' from source: unknown 30582 1726855365.80100: calling self._execute() 30582 1726855365.80179: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.80183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.80196: variable 'omit' from source: magic vars 30582 1726855365.80485: variable 'ansible_distribution_major_version' from source: facts 30582 1726855365.80496: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855365.80502: variable 'omit' from source: magic vars 30582 1726855365.80543: variable 'omit' from source: magic vars 30582 1726855365.80571: variable 'omit' from source: magic vars 30582 1726855365.80603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855365.80639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855365.80657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855365.80672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.80718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855365.80789: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855365.80793: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.80795: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855365.80895: Set connection var ansible_timeout to 10 30582 1726855365.80898: Set connection var ansible_connection to ssh 30582 1726855365.80900: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855365.80902: Set connection var ansible_pipelining to False 30582 1726855365.80913: Set connection var ansible_shell_executable to /bin/sh 30582 1726855365.80918: Set connection var ansible_shell_type to sh 30582 1726855365.80920: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.80922: variable 'ansible_connection' from source: unknown 30582 1726855365.80925: variable 'ansible_module_compression' from source: unknown 30582 1726855365.80927: variable 'ansible_shell_type' from source: unknown 30582 1726855365.80929: variable 'ansible_shell_executable' from source: unknown 30582 1726855365.80931: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855365.80933: variable 'ansible_pipelining' from source: unknown 30582 1726855365.80934: variable 'ansible_timeout' from source: unknown 30582 1726855365.80936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855365.81112: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855365.81117: variable 'omit' from source: magic vars 30582 1726855365.81119: starting attempt loop 30582 1726855365.81121: running the handler 30582 1726855365.81124: _low_level_execute_command(): starting 30582 1726855365.81134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855365.81796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.81832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.81845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.81920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.83602: stdout chunk (state=3): >>>/root <<< 30582 1726855365.83703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.83740: stderr chunk (state=3): >>><<< 30582 1726855365.83747: stdout chunk (state=3): >>><<< 30582 1726855365.83812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.83817: _low_level_execute_command(): starting 30582 1726855365.83820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048 `" && echo ansible-tmp-1726855365.837732-35366-147132704944048="` echo /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048 `" ) && sleep 0' 30582 1726855365.84355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855365.84392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855365.84395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.84398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855365.84408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855365.84419: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855365.84428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.84441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 
1726855365.84450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855365.84494: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855365.84497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855365.84499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855365.84502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855365.84504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855365.84511: stderr chunk (state=3): >>>debug2: match found <<< 30582 1726855365.84526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.84591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.84614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855365.84618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.84708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.86626: stdout chunk (state=3): >>>ansible-tmp-1726855365.837732-35366-147132704944048=/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048 <<< 30582 1726855365.86738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.86799: stderr chunk (state=3): >>><<< 30582 1726855365.86808: stdout chunk (state=3): >>><<< 30582 1726855365.86835: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855365.837732-35366-147132704944048=/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.86900: variable 'ansible_module_compression' from source: unknown 30582 1726855365.86996: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855365.87001: variable 'ansible_facts' from source: unknown 30582 1726855365.87106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py 30582 1726855365.87356: Sending initial data 30582 1726855365.87359: Sent initial data (152 bytes) 30582 1726855365.88003: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.88061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.88081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855365.88105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.88191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.89829: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855365.89908: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855365.89975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc7js3xd3 /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py <<< 30582 1726855365.89978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py" <<< 30582 1726855365.90046: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc7js3xd3" to remote "/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py" <<< 30582 1726855365.90845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.90877: stderr chunk (state=3): >>><<< 30582 1726855365.90886: stdout chunk (state=3): >>><<< 30582 1726855365.90913: done transferring module to remote 30582 1726855365.90930: _low_level_execute_command(): starting 30582 1726855365.90941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/ /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py && sleep 0' 30582 1726855365.91604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855365.91709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855365.91740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855365.91756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855365.91779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.91868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855365.93690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855365.93732: stdout chunk (state=3): >>><<< 30582 1726855365.93736: stderr chunk (state=3): >>><<< 30582 1726855365.93794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855365.93803: _low_level_execute_command(): starting 30582 1726855365.93806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/AnsiballZ_ping.py && sleep 0' 30582 1726855365.94320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855365.94334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855365.94428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.09531: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 
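The stdout chunk `{"ping": "pong", ...}` above is the entire payload of the ping module that was transferred as `AnsiballZ_ping.py`. Its core behavior can be sketched in a few lines (a deliberate simplification of `ansible.builtin.ping`; the raise-on-`"crash"` detail matches the upstream module to the best of my knowledge, but treat it as an assumption):

```python
def ping(data="pong"):
    """Minimal sketch of what the ping module does on the target: echo
    back the `data` parameter, so that both connectivity and the remote
    Python runtime are exercised end to end."""
    if data == "crash":
        # The real module deliberately raises here so error-handling
        # paths can be tested too (assumed behavior).
        raise Exception("boom")
    return {"ping": data}

assert ping() == {"ping": "pong"}  # matches the module stdout in the log
```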
30582 1726855366.10936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855366.10940: stdout chunk (state=3): >>><<< 30582 1726855366.10943: stderr chunk (state=3): >>><<< 30582 1726855366.10972: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855366.11045: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855366.11049: _low_level_execute_command(): starting 30582 1726855366.11051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855365.837732-35366-147132704944048/ > /dev/null 2>&1 && sleep 0' 30582 1726855366.11700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855366.11704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.11707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855366.11738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found <<< 30582 1726855366.11741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.11797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855366.11810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.11875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.13740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.13772: stderr chunk (state=3): >>><<< 30582 1726855366.13775: stdout chunk (state=3): >>><<< 30582 1726855366.13789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
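The `_low_level_execute_command()` calls bracketing the module run follow a fixed remote-tmp lifecycle: create a uniquely named directory under `~/.ansible/tmp` with `umask 77`, `chmod u+x` the payload, execute it, then `rm -f -r` the directory. A sketch of the directory-naming scheme, inferred from `ansible-tmp-1726855365.837732-35366-147132704944048` in the log (timestamp, then pid, then a random suffix; the exact width of the random component is an assumption):

```python
import os
import random
import re
import time

def remote_tmpdir_name(pid=None):
    """Build an ansible-tmp-<epoch>-<pid>-<random> style name, mirroring
    the pattern visible in the mkdir command above (inferred, not taken
    from ansible-core source)."""
    pid = os.getpid() if pid is None else pid
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))

name = remote_tmpdir_name(pid=35366)
# Same shape as ansible-tmp-1726855365.837732-35366-147132704944048
assert re.fullmatch(r"ansible-tmp-\d+\.\d+-35366-\d+", name)
```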
30582 1726855366.13798: handler run complete 30582 1726855366.13810: attempt loop complete, returning result 30582 1726855366.13813: _execute() done 30582 1726855366.13815: dumping result to json 30582 1726855366.13817: done dumping result, returning 30582 1726855366.13826: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-000000001d40] 30582 1726855366.13831: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d40 30582 1726855366.13925: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001d40 30582 1726855366.13927: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855366.14033: no more pending results, returning what we have 30582 1726855366.14038: results queue empty 30582 1726855366.14039: checking for any_errors_fatal 30582 1726855366.14047: done checking for any_errors_fatal 30582 1726855366.14048: checking for max_fail_percentage 30582 1726855366.14050: done checking for max_fail_percentage 30582 1726855366.14051: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.14051: done checking to see if all hosts have failed 30582 1726855366.14052: getting the remaining hosts for this loop 30582 1726855366.14053: done getting the remaining hosts for this loop 30582 1726855366.14057: getting the next task for host managed_node3 30582 1726855366.14069: done getting next task for host managed_node3 30582 1726855366.14071: ^ task is: TASK: meta (role_complete) 30582 1726855366.14076: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855366.14091: getting variables 30582 1726855366.14092: in VariableManager get_vars() 30582 1726855366.14133: Calling all_inventory to load vars for managed_node3 30582 1726855366.14137: Calling groups_inventory to load vars for managed_node3 30582 1726855366.14139: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.14148: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.14151: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.14153: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.15531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.16431: done with get_vars() 30582 1726855366.16450: done getting variables 30582 1726855366.16520: done queuing things up, now waiting for results queue to drain 30582 1726855366.16522: results queue empty 30582 1726855366.16522: checking for any_errors_fatal 30582 1726855366.16524: done checking for any_errors_fatal 30582 1726855366.16525: checking for max_fail_percentage 30582 1726855366.16525: done checking for max_fail_percentage 30582 1726855366.16526: checking to see if all 
hosts have failed and the running result is not ok 30582 1726855366.16526: done checking to see if all hosts have failed 30582 1726855366.16527: getting the remaining hosts for this loop 30582 1726855366.16527: done getting the remaining hosts for this loop 30582 1726855366.16529: getting the next task for host managed_node3 30582 1726855366.16533: done getting next task for host managed_node3 30582 1726855366.16535: ^ task is: TASK: Asserts 30582 1726855366.16536: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855366.16538: getting variables 30582 1726855366.16539: in VariableManager get_vars() 30582 1726855366.16548: Calling all_inventory to load vars for managed_node3 30582 1726855366.16549: Calling groups_inventory to load vars for managed_node3 30582 1726855366.16550: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.16554: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.16555: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.16557: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.17235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.18483: done with get_vars() 30582 1726855366.18507: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:02:46 -0400 (0:00:00.390) 0:01:42.535 ****** 30582 1726855366.18591: entering _queue_task() for managed_node3/include_tasks 30582 1726855366.19217: worker is 1 (out of 1 available) 30582 1726855366.19228: exiting _queue_task() for managed_node3/include_tasks 30582 1726855366.19238: done queuing things up, now waiting for results queue to drain 30582 1726855366.19239: waiting for pending results... 
30582 1726855366.19481: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855366.19486: in run() - task 0affcc66-ac2b-aa83-7d57-000000001749 30582 1726855366.19492: variable 'ansible_search_path' from source: unknown 30582 1726855366.19502: variable 'ansible_search_path' from source: unknown 30582 1726855366.19551: variable 'lsr_assert' from source: include params 30582 1726855366.19804: variable 'lsr_assert' from source: include params 30582 1726855366.19882: variable 'omit' from source: magic vars 30582 1726855366.20041: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.20056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.20122: variable 'omit' from source: magic vars 30582 1726855366.20328: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.20449: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.20453: variable 'item' from source: unknown 30582 1726855366.20455: variable 'item' from source: unknown 30582 1726855366.20558: variable 'item' from source: unknown 30582 1726855366.20562: variable 'item' from source: unknown 30582 1726855366.20802: dumping result to json 30582 1726855366.20806: done dumping result, returning 30582 1726855366.20808: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-000000001749] 30582 1726855366.20810: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001749 30582 1726855366.20856: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001749 30582 1726855366.20859: WORKER PROCESS EXITING 30582 1726855366.20928: no more pending results, returning what we have 30582 1726855366.20934: in VariableManager get_vars() 30582 1726855366.20993: Calling all_inventory to load vars for managed_node3 30582 1726855366.20996: Calling groups_inventory to load vars for managed_node3 30582 1726855366.20999: Calling all_plugins_inventory 
to load vars for managed_node3 30582 1726855366.21014: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.21018: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.21021: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.22706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.24426: done with get_vars() 30582 1726855366.24460: variable 'ansible_search_path' from source: unknown 30582 1726855366.24462: variable 'ansible_search_path' from source: unknown 30582 1726855366.24511: we have included files to process 30582 1726855366.24512: generating all_blocks data 30582 1726855366.24514: done generating all_blocks data 30582 1726855366.24520: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855366.24521: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855366.24524: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855366.24655: in VariableManager get_vars() 30582 1726855366.24682: done with get_vars() 30582 1726855366.24808: done processing included file 30582 1726855366.24811: iterating over new_blocks loaded from include file 30582 1726855366.24812: in VariableManager get_vars() 30582 1726855366.24830: done with get_vars() 30582 1726855366.24832: filtering new block on tags 30582 1726855366.24879: done filtering new block on tags 30582 1726855366.24882: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=tasks/assert_profile_absent.yml) 30582 
1726855366.24889: extending task lists for all hosts with included blocks 30582 1726855366.26505: done extending task lists 30582 1726855366.26507: done processing included files 30582 1726855366.26508: results queue empty 30582 1726855366.26508: checking for any_errors_fatal 30582 1726855366.26510: done checking for any_errors_fatal 30582 1726855366.26511: checking for max_fail_percentage 30582 1726855366.26512: done checking for max_fail_percentage 30582 1726855366.26513: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.26514: done checking to see if all hosts have failed 30582 1726855366.26515: getting the remaining hosts for this loop 30582 1726855366.26516: done getting the remaining hosts for this loop 30582 1726855366.26519: getting the next task for host managed_node3 30582 1726855366.26524: done getting next task for host managed_node3 30582 1726855366.26526: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855366.26529: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855366.26532: getting variables 30582 1726855366.26533: in VariableManager get_vars() 30582 1726855366.26548: Calling all_inventory to load vars for managed_node3 30582 1726855366.26551: Calling groups_inventory to load vars for managed_node3 30582 1726855366.26554: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.26560: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.26563: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.26570: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.28483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.30418: done with get_vars() 30582 1726855366.30447: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 14:02:46 -0400 (0:00:00.120) 0:01:42.656 ****** 30582 1726855366.30629: entering _queue_task() for managed_node3/include_tasks 30582 1726855366.31401: worker is 1 (out of 1 available) 30582 1726855366.31413: exiting _queue_task() for managed_node3/include_tasks 30582 1726855366.31424: done queuing things up, now waiting for results queue to drain 30582 1726855366.31426: waiting for pending results... 
30582 1726855366.31730: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855366.31930: in run() - task 0affcc66-ac2b-aa83-7d57-000000001e99 30582 1726855366.31934: variable 'ansible_search_path' from source: unknown 30582 1726855366.31937: variable 'ansible_search_path' from source: unknown 30582 1726855366.31939: calling self._execute() 30582 1726855366.31992: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.32004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.32018: variable 'omit' from source: magic vars 30582 1726855366.32432: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.32449: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.32458: _execute() done 30582 1726855366.32475: dumping result to json 30582 1726855366.32485: done dumping result, returning 30582 1726855366.32499: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-000000001e99] 30582 1726855366.32509: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001e99 30582 1726855366.32755: no more pending results, returning what we have 30582 1726855366.32762: in VariableManager get_vars() 30582 1726855366.32819: Calling all_inventory to load vars for managed_node3 30582 1726855366.32823: Calling groups_inventory to load vars for managed_node3 30582 1726855366.32826: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.32841: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.32845: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.32849: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.33400: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001e99 30582 1726855366.33404: WORKER PROCESS EXITING 30582 
1726855366.34541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.36209: done with get_vars() 30582 1726855366.36235: variable 'ansible_search_path' from source: unknown 30582 1726855366.36236: variable 'ansible_search_path' from source: unknown 30582 1726855366.36245: variable 'item' from source: include params 30582 1726855366.36367: variable 'item' from source: include params 30582 1726855366.36408: we have included files to process 30582 1726855366.36410: generating all_blocks data 30582 1726855366.36411: done generating all_blocks data 30582 1726855366.36413: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855366.36414: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855366.36417: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855366.37402: done processing included file 30582 1726855366.37405: iterating over new_blocks loaded from include file 30582 1726855366.37406: in VariableManager get_vars() 30582 1726855366.37427: done with get_vars() 30582 1726855366.37429: filtering new block on tags 30582 1726855366.37508: done filtering new block on tags 30582 1726855366.37512: in VariableManager get_vars() 30582 1726855366.37528: done with get_vars() 30582 1726855366.37530: filtering new block on tags 30582 1726855366.37597: done filtering new block on tags 30582 1726855366.37600: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855366.37605: extending task lists for all hosts with included blocks 30582 1726855366.37954: done 
extending task lists 30582 1726855366.37956: done processing included files 30582 1726855366.37956: results queue empty 30582 1726855366.37957: checking for any_errors_fatal 30582 1726855366.37962: done checking for any_errors_fatal 30582 1726855366.37963: checking for max_fail_percentage 30582 1726855366.37964: done checking for max_fail_percentage 30582 1726855366.37967: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.37968: done checking to see if all hosts have failed 30582 1726855366.37969: getting the remaining hosts for this loop 30582 1726855366.37970: done getting the remaining hosts for this loop 30582 1726855366.37973: getting the next task for host managed_node3 30582 1726855366.37977: done getting next task for host managed_node3 30582 1726855366.37979: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855366.37983: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855366.37985: getting variables 30582 1726855366.37986: in VariableManager get_vars() 30582 1726855366.37998: Calling all_inventory to load vars for managed_node3 30582 1726855366.38000: Calling groups_inventory to load vars for managed_node3 30582 1726855366.38002: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.38012: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.38015: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.38017: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.39203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.40830: done with get_vars() 30582 1726855366.40858: done getting variables 30582 1726855366.40909: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:02:46 -0400 (0:00:00.103) 0:01:42.759 ****** 30582 1726855366.40948: entering _queue_task() for managed_node3/set_fact 30582 1726855366.41472: worker is 1 (out of 1 available) 30582 1726855366.41484: exiting _queue_task() for managed_node3/set_fact 30582 1726855366.41498: done queuing things up, now waiting for results queue to drain 30582 1726855366.41499: waiting for pending results... 
30582 1726855366.41712: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855366.41847: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f17 30582 1726855366.41872: variable 'ansible_search_path' from source: unknown 30582 1726855366.41880: variable 'ansible_search_path' from source: unknown 30582 1726855366.41927: calling self._execute() 30582 1726855366.42116: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.42122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.42126: variable 'omit' from source: magic vars 30582 1726855366.42492: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.42510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.42521: variable 'omit' from source: magic vars 30582 1726855366.42589: variable 'omit' from source: magic vars 30582 1726855366.42629: variable 'omit' from source: magic vars 30582 1726855366.42684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855366.42724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855366.42749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855366.42883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.42886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.42891: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855366.42893: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.42895: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855366.43136: Set connection var ansible_timeout to 10 30582 1726855366.43210: Set connection var ansible_connection to ssh 30582 1726855366.43215: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855366.43226: Set connection var ansible_pipelining to False 30582 1726855366.43236: Set connection var ansible_shell_executable to /bin/sh 30582 1726855366.43247: Set connection var ansible_shell_type to sh 30582 1726855366.43280: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.43327: variable 'ansible_connection' from source: unknown 30582 1726855366.43336: variable 'ansible_module_compression' from source: unknown 30582 1726855366.43357: variable 'ansible_shell_type' from source: unknown 30582 1726855366.43430: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.43433: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.43435: variable 'ansible_pipelining' from source: unknown 30582 1726855366.43437: variable 'ansible_timeout' from source: unknown 30582 1726855366.43439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.43898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855366.43902: variable 'omit' from source: magic vars 30582 1726855366.43904: starting attempt loop 30582 1726855366.43907: running the handler 30582 1726855366.43909: handler run complete 30582 1726855366.43910: attempt loop complete, returning result 30582 1726855366.43912: _execute() done 30582 1726855366.43914: dumping result to json 30582 1726855366.43916: done dumping result, returning 30582 1726855366.43918: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-000000001f17] 30582 1726855366.43920: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f17 30582 1726855366.44186: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f17 30582 1726855366.44192: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855366.44254: no more pending results, returning what we have 30582 1726855366.44259: results queue empty 30582 1726855366.44260: checking for any_errors_fatal 30582 1726855366.44262: done checking for any_errors_fatal 30582 1726855366.44262: checking for max_fail_percentage 30582 1726855366.44267: done checking for max_fail_percentage 30582 1726855366.44268: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.44268: done checking to see if all hosts have failed 30582 1726855366.44269: getting the remaining hosts for this loop 30582 1726855366.44271: done getting the remaining hosts for this loop 30582 1726855366.44275: getting the next task for host managed_node3 30582 1726855366.44282: done getting next task for host managed_node3 30582 1726855366.44285: ^ task is: TASK: Stat profile file 30582 1726855366.44294: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855366.44300: getting variables 30582 1726855366.44302: in VariableManager get_vars() 30582 1726855366.44351: Calling all_inventory to load vars for managed_node3 30582 1726855366.44355: Calling groups_inventory to load vars for managed_node3 30582 1726855366.44358: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.44376: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.44381: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.44385: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.47617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.51291: done with get_vars() 30582 1726855366.51327: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 14:02:46 -0400 (0:00:00.106) 0:01:42.866 ****** 30582 1726855366.51636: entering _queue_task() for managed_node3/stat 30582 1726855366.52334: worker is 1 (out of 1 available) 30582 1726855366.52346: exiting _queue_task() for managed_node3/stat 30582 1726855366.52357: done queuing things up, now waiting for results queue to drain 30582 1726855366.52358: 
waiting for pending results... 30582 1726855366.52752: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855366.52843: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f18 30582 1726855366.52873: variable 'ansible_search_path' from source: unknown 30582 1726855366.52961: variable 'ansible_search_path' from source: unknown 30582 1726855366.52968: calling self._execute() 30582 1726855366.53032: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.53044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.53061: variable 'omit' from source: magic vars 30582 1726855366.53467: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.53486: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.53503: variable 'omit' from source: magic vars 30582 1726855366.53568: variable 'omit' from source: magic vars 30582 1726855366.53677: variable 'profile' from source: play vars 30582 1726855366.53689: variable 'interface' from source: play vars 30582 1726855366.53892: variable 'interface' from source: play vars 30582 1726855366.53896: variable 'omit' from source: magic vars 30582 1726855366.53898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855366.53901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855366.53903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855366.53905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.53908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.53943: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855366.53947: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.53954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.54147: Set connection var ansible_timeout to 10 30582 1726855366.54151: Set connection var ansible_connection to ssh 30582 1726855366.54155: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855366.54157: Set connection var ansible_pipelining to False 30582 1726855366.54159: Set connection var ansible_shell_executable to /bin/sh 30582 1726855366.54162: Set connection var ansible_shell_type to sh 30582 1726855366.54163: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.54165: variable 'ansible_connection' from source: unknown 30582 1726855366.54167: variable 'ansible_module_compression' from source: unknown 30582 1726855366.54173: variable 'ansible_shell_type' from source: unknown 30582 1726855366.54175: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.54177: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.54178: variable 'ansible_pipelining' from source: unknown 30582 1726855366.54181: variable 'ansible_timeout' from source: unknown 30582 1726855366.54183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.54366: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855366.54371: variable 'omit' from source: magic vars 30582 1726855366.54373: starting attempt loop 30582 1726855366.54376: running the handler 30582 1726855366.54378: _low_level_execute_command(): starting 30582 1726855366.54380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 
1726855366.55120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855366.55128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855366.55139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855366.55159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855366.55314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855366.55320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.55385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.57083: stdout chunk (state=3): >>>/root <<< 30582 1726855366.57273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.57278: stdout chunk (state=3): >>><<< 30582 1726855366.57293: stderr chunk (state=3): >>><<< 30582 1726855366.57513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855366.57525: _low_level_execute_command(): starting 30582 1726855366.57536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965 `" && echo ansible-tmp-1726855366.5751107-35400-180815340233965="` echo /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965 `" ) && sleep 0' 30582 1726855366.58814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.58973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855366.58994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855366.59009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.59255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.61111: stdout chunk (state=3): >>>ansible-tmp-1726855366.5751107-35400-180815340233965=/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965 <<< 30582 1726855366.61252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.61264: stdout chunk (state=3): >>><<< 30582 1726855366.61276: stderr chunk (state=3): >>><<< 30582 1726855366.61313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855366.5751107-35400-180815340233965=/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855366.61378: variable 'ansible_module_compression' from source: unknown 30582 1726855366.61562: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855366.61608: variable 'ansible_facts' from source: unknown 30582 1726855366.61911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py 30582 1726855366.62120: Sending initial data 30582 1726855366.62123: Sent initial data (153 bytes) 30582 1726855366.62598: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855366.62606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855366.62637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855366.62641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855366.62645: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
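The temporary-directory command shown above follows a fixed idiom: restrict permissions with `umask 77`, ensure `~/.ansible/tmp` exists, create a uniquely named per-task directory, and echo the resulting path back so the controller learns it. A minimal local sketch of that idiom (the directory name here is illustrative, not the exact timestamped one from this run):

```shell
# Recreate Ansible's remote tmpdir idiom under $HOME (names are illustrative).
umask 77                                     # new directories get mode 700
tmpname="ansible-tmp-$(date +%s)-$$-sketch"  # timestamp + PID for uniqueness
mkdir -p "$HOME/.ansible/tmp"                # base dir, idempotent
mkdir "$HOME/.ansible/tmp/$tmpname"          # per-task dir; fails if it already exists
echo "$tmpname=$HOME/.ansible/tmp/$tmpname"  # path echoed back to the controller
```

The `key=value` line on stdout is what appears in the log as `ansible-tmp-…=/root/.ansible/tmp/ansible-tmp-…`; the module payload (`AnsiballZ_stat.py`) is then transferred into that directory over SFTP.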
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.62703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855366.62706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.62779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.64509: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855366.64558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855366.64633: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpfofpm4bo /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py <<< 30582 1726855366.64638: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py" <<< 30582 1726855366.64692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpfofpm4bo" to remote "/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py" <<< 30582 1726855366.65966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.66010: stderr chunk (state=3): >>><<< 30582 1726855366.66014: stdout chunk (state=3): >>><<< 30582 1726855366.66043: done transferring module to remote 30582 1726855366.66052: _low_level_execute_command(): starting 30582 1726855366.66057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/ /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py && sleep 0' 30582 1726855366.66783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855366.66802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.66922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.68685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.68713: stderr chunk (state=3): >>><<< 30582 1726855366.68717: stdout chunk (state=3): >>><<< 30582 1726855366.68731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855366.68734: _low_level_execute_command(): starting 30582 1726855366.68739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/AnsiballZ_stat.py && sleep 0' 30582 1726855366.69384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855366.69429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.69506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.84694: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 30582 1726855366.86071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855366.86091: stderr chunk (state=3): >>><<< 30582 1726855366.86095: stdout chunk (state=3): >>><<< 30582 1726855366.86112: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855366.86150: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855366.86158: _low_level_execute_command(): starting 30582 1726855366.86163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855366.5751107-35400-180815340233965/ > /dev/null 2>&1 && sleep 0' 30582 1726855366.86621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855366.86624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.86627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855366.86629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855366.86631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855366.86679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855366.86682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.86748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855366.88701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855366.88705: stdout chunk (state=3): >>><<< 30582 1726855366.88707: stderr chunk (state=3): >>><<< 30582 1726855366.88723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855366.88894: handler run complete 30582 1726855366.88897: attempt loop complete, returning result 30582 1726855366.88898: _execute() done 30582 1726855366.88900: dumping result to json 30582 1726855366.88902: done dumping result, returning 30582 1726855366.88903: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-000000001f18] 30582 1726855366.88905: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f18 30582 1726855366.88977: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f18 30582 1726855366.88981: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855366.89039: no more pending results, returning what we have 30582 1726855366.89042: results queue empty 30582 1726855366.89043: checking for any_errors_fatal 30582 1726855366.89051: done checking for any_errors_fatal 30582 1726855366.89053: checking for max_fail_percentage 30582 1726855366.89055: done checking for max_fail_percentage 30582 1726855366.89056: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.89057: done checking to see if all hosts have failed 30582 1726855366.89059: getting the remaining hosts for this loop 30582 1726855366.89064: done getting the remaining hosts for this loop 30582 1726855366.89068: getting the next task for host managed_node3 30582 1726855366.89076: done getting next task for host managed_node3 30582 1726855366.89078: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855366.89083: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855366.89089: getting variables 30582 1726855366.89091: in VariableManager get_vars() 30582 1726855366.89134: Calling all_inventory to load vars for managed_node3 30582 1726855366.89137: Calling groups_inventory to load vars for managed_node3 30582 1726855366.89141: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.89151: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.89153: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.89156: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.90701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.91750: done with get_vars() 30582 1726855366.91774: done getting variables 30582 1726855366.91821: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:02:46 -0400 (0:00:00.402) 0:01:43.268 ****** 30582 1726855366.91847: entering _queue_task() for managed_node3/set_fact 30582 1726855366.92119: worker is 1 (out of 1 available) 30582 1726855366.92132: exiting _queue_task() for managed_node3/set_fact 30582 1726855366.92145: done queuing things up, now waiting for results queue to drain 30582 1726855366.92146: waiting for pending results... 30582 1726855366.92332: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855366.92432: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f19 30582 1726855366.92443: variable 'ansible_search_path' from source: unknown 30582 1726855366.92446: variable 'ansible_search_path' from source: unknown 30582 1726855366.92481: calling self._execute() 30582 1726855366.92551: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.92556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.92564: variable 'omit' from source: magic vars 30582 1726855366.92928: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.92931: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.93117: variable 'profile_stat' from source: set_fact 30582 1726855366.93121: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855366.93124: when evaluation is False, skipping this task 30582 1726855366.93126: _execute() done 30582 1726855366.93128: dumping result to json 30582 1726855366.93131: done dumping 
result, returning 30582 1726855366.93133: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-000000001f19] 30582 1726855366.93135: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f19 30582 1726855366.93199: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f19 30582 1726855366.93202: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855366.93260: no more pending results, returning what we have 30582 1726855366.93264: results queue empty 30582 1726855366.93265: checking for any_errors_fatal 30582 1726855366.93276: done checking for any_errors_fatal 30582 1726855366.93277: checking for max_fail_percentage 30582 1726855366.93279: done checking for max_fail_percentage 30582 1726855366.93280: checking to see if all hosts have failed and the running result is not ok 30582 1726855366.93281: done checking to see if all hosts have failed 30582 1726855366.93281: getting the remaining hosts for this loop 30582 1726855366.93283: done getting the remaining hosts for this loop 30582 1726855366.93286: getting the next task for host managed_node3 30582 1726855366.93295: done getting next task for host managed_node3 30582 1726855366.93297: ^ task is: TASK: Get NM profile info 30582 1726855366.93302: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855366.93306: getting variables 30582 1726855366.93308: in VariableManager get_vars() 30582 1726855366.93347: Calling all_inventory to load vars for managed_node3 30582 1726855366.93349: Calling groups_inventory to load vars for managed_node3 30582 1726855366.93352: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855366.93363: Calling all_plugins_play to load vars for managed_node3 30582 1726855366.93366: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855366.93368: Calling groups_plugins_play to load vars for managed_node3 30582 1726855366.94598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855366.95470: done with get_vars() 30582 1726855366.95490: done getting variables 30582 1726855366.95533: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:02:46 -0400 (0:00:00.037) 
0:01:43.305 ****** 30582 1726855366.95557: entering _queue_task() for managed_node3/shell 30582 1726855366.95828: worker is 1 (out of 1 available) 30582 1726855366.95842: exiting _queue_task() for managed_node3/shell 30582 1726855366.95855: done queuing things up, now waiting for results queue to drain 30582 1726855366.95857: waiting for pending results... 30582 1726855366.96205: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855366.96234: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f1a 30582 1726855366.96258: variable 'ansible_search_path' from source: unknown 30582 1726855366.96268: variable 'ansible_search_path' from source: unknown 30582 1726855366.96322: calling self._execute() 30582 1726855366.96427: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.96439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.96454: variable 'omit' from source: magic vars 30582 1726855366.96861: variable 'ansible_distribution_major_version' from source: facts 30582 1726855366.96884: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855366.96899: variable 'omit' from source: magic vars 30582 1726855366.96960: variable 'omit' from source: magic vars 30582 1726855366.97192: variable 'profile' from source: play vars 30582 1726855366.97196: variable 'interface' from source: play vars 30582 1726855366.97199: variable 'interface' from source: play vars 30582 1726855366.97201: variable 'omit' from source: magic vars 30582 1726855366.97225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855366.97270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855366.97300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855366.97323: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.97341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855366.97384: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855366.97396: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.97403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.97521: Set connection var ansible_timeout to 10 30582 1726855366.97529: Set connection var ansible_connection to ssh 30582 1726855366.97542: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855366.97550: Set connection var ansible_pipelining to False 30582 1726855366.97559: Set connection var ansible_shell_executable to /bin/sh 30582 1726855366.97569: Set connection var ansible_shell_type to sh 30582 1726855366.97599: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.97607: variable 'ansible_connection' from source: unknown 30582 1726855366.97614: variable 'ansible_module_compression' from source: unknown 30582 1726855366.97620: variable 'ansible_shell_type' from source: unknown 30582 1726855366.97626: variable 'ansible_shell_executable' from source: unknown 30582 1726855366.97632: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855366.97640: variable 'ansible_pipelining' from source: unknown 30582 1726855366.97646: variable 'ansible_timeout' from source: unknown 30582 1726855366.97653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855366.97818: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855366.97992: variable 'omit' from source: magic vars 30582 1726855366.97995: starting attempt loop 30582 1726855366.97997: running the handler 30582 1726855366.98000: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855366.98002: _low_level_execute_command(): starting 30582 1726855366.98003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855366.98574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855366.98684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855366.98711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 30582 1726855366.98752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855366.98812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.00519: stdout chunk (state=3): >>>/root <<< 30582 1726855367.00658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.00681: stdout chunk (state=3): >>><<< 30582 1726855367.00698: stderr chunk (state=3): >>><<< 30582 1726855367.00725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.00748: _low_level_execute_command(): starting 30582 1726855367.00762: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317 `" && echo ansible-tmp-1726855367.0073245-35421-101409184415317="` echo /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317 `" ) && sleep 0' 30582 1726855367.01505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.01562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.01583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.01608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.01701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.03636: stdout chunk (state=3): >>>ansible-tmp-1726855367.0073245-35421-101409184415317=/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317 <<< 30582 1726855367.03789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.03820: stdout chunk (state=3): >>><<< 30582 1726855367.03824: stderr 
chunk (state=3): >>><<< 30582 1726855367.03993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855367.0073245-35421-101409184415317=/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.03997: variable 'ansible_module_compression' from source: unknown 30582 1726855367.03999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855367.04001: variable 'ansible_facts' from source: unknown 30582 1726855367.04081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py 30582 1726855367.04247: Sending initial data 30582 1726855367.04258: Sent initial data (156 bytes) 30582 1726855367.04922: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855367.04938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.04955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855367.05007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855367.05022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855367.05114: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.05137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.05232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.06807: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855367.06879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855367.06933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp80m87gar /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py <<< 30582 1726855367.06936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py" <<< 30582 1726855367.07020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp80m87gar" to remote "/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py" <<< 30582 1726855367.07860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.07952: stderr chunk (state=3): >>><<< 30582 1726855367.07955: stdout chunk (state=3): >>><<< 30582 1726855367.07958: done transferring module to remote 30582 1726855367.07976: _low_level_execute_command(): starting 30582 1726855367.08064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/ /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py && sleep 0' 30582 1726855367.08700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.08764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.08783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.08805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.08907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.10685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.10695: stderr chunk (state=3): >>><<< 30582 1726855367.10698: stdout chunk (state=3): >>><<< 30582 1726855367.10711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.10714: _low_level_execute_command(): starting 30582 1726855367.10720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/AnsiballZ_command.py && sleep 0' 30582 1726855367.11154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.11157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.11160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.11162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found <<< 30582 1726855367.11164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.11208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.11212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.11280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.28099: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:02:47.263106", "end": "2024-09-20 14:02:47.279988", "delta": "0:00:00.016882", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855367.29570: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855367.29591: stderr chunk (state=3): >>><<< 30582 1726855367.29594: stdout chunk (state=3): >>><<< 30582 1726855367.29611: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:02:47.263106", "end": "2024-09-20 14:02:47.279988", "delta": "0:00:00.016882", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 
closed. 30582 1726855367.29640: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855367.29647: _low_level_execute_command(): starting 30582 1726855367.29652: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855367.0073245-35421-101409184415317/ > /dev/null 2>&1 && sleep 0' 30582 1726855367.30082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855367.30092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.30120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855367.30123: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.30175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.30181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.30182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.30240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.32063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.32092: stderr chunk (state=3): >>><<< 30582 1726855367.32095: stdout chunk (state=3): >>><<< 30582 1726855367.32108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.32115: handler run complete 30582 1726855367.32134: Evaluated conditional (False): False 30582 1726855367.32142: attempt loop complete, returning result 30582 1726855367.32145: _execute() done 30582 1726855367.32147: dumping result to json 30582 1726855367.32152: done dumping result, returning 30582 1726855367.32160: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-000000001f1a] 30582 1726855367.32167: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1a 30582 1726855367.32267: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1a 30582 1726855367.32270: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016882", "end": "2024-09-20 14:02:47.279988", "rc": 1, "start": "2024-09-20 14:02:47.263106" } MSG: non-zero return code ...ignoring 30582 1726855367.32361: no more pending results, returning what we have 30582 1726855367.32368: results queue empty 30582 1726855367.32369: checking for any_errors_fatal 30582 1726855367.32378: done checking for any_errors_fatal 30582 1726855367.32378: checking for max_fail_percentage 30582 1726855367.32380: done checking for max_fail_percentage 30582 1726855367.32381: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.32382: done checking to see if all hosts have failed 30582 1726855367.32382: getting the remaining hosts for this loop 30582 1726855367.32384: done getting the remaining hosts for this loop 30582 1726855367.32389: getting the next task for host managed_node3 30582 1726855367.32397: done getting next task for host managed_node3 30582 1726855367.32400: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30582 1726855367.32405: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.32408: getting variables 30582 1726855367.32410: in VariableManager get_vars() 30582 1726855367.32448: Calling all_inventory to load vars for managed_node3 30582 1726855367.32450: Calling groups_inventory to load vars for managed_node3 30582 1726855367.32453: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.32464: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.32469: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.32472: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.33360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.34896: done with get_vars() 30582 1726855367.34920: done getting variables 30582 1726855367.34978: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 14:02:47 -0400 (0:00:00.394) 0:01:43.699 ****** 30582 1726855367.35007: entering _queue_task() for managed_node3/set_fact 30582 1726855367.35279: worker is 1 (out of 1 available) 30582 1726855367.35295: exiting _queue_task() for managed_node3/set_fact 30582 1726855367.35307: done queuing things up, now waiting for results queue to drain 30582 1726855367.35309: waiting for pending results... 
30582 1726855367.35496: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855367.35580: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f1b 30582 1726855367.35596: variable 'ansible_search_path' from source: unknown 30582 1726855367.35599: variable 'ansible_search_path' from source: unknown 30582 1726855367.35628: calling self._execute() 30582 1726855367.35705: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.35709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.35716: variable 'omit' from source: magic vars 30582 1726855367.35998: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.36007: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.36100: variable 'nm_profile_exists' from source: set_fact 30582 1726855367.36115: Evaluated conditional (nm_profile_exists.rc == 0): False 30582 1726855367.36118: when evaluation is False, skipping this task 30582 1726855367.36121: _execute() done 30582 1726855367.36124: dumping result to json 30582 1726855367.36126: done dumping result, returning 30582 1726855367.36134: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-000000001f1b] 30582 1726855367.36138: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1b 30582 1726855367.36225: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1b 30582 1726855367.36228: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30582 1726855367.36277: no more pending results, returning what we have 30582 1726855367.36281: results queue empty 30582 1726855367.36281: checking for any_errors_fatal 30582 
1726855367.36292: done checking for any_errors_fatal 30582 1726855367.36292: checking for max_fail_percentage 30582 1726855367.36294: done checking for max_fail_percentage 30582 1726855367.36295: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.36296: done checking to see if all hosts have failed 30582 1726855367.36297: getting the remaining hosts for this loop 30582 1726855367.36298: done getting the remaining hosts for this loop 30582 1726855367.36302: getting the next task for host managed_node3 30582 1726855367.36313: done getting next task for host managed_node3 30582 1726855367.36317: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855367.36322: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.36327: getting variables 30582 1726855367.36328: in VariableManager get_vars() 30582 1726855367.36373: Calling all_inventory to load vars for managed_node3 30582 1726855367.36376: Calling groups_inventory to load vars for managed_node3 30582 1726855367.36379: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.36397: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.36400: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.36403: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.37902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.39317: done with get_vars() 30582 1726855367.39339: done getting variables 30582 1726855367.39386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855367.39474: variable 'profile' from source: play vars 30582 1726855367.39477: variable 'interface' from source: play vars 30582 1726855367.39520: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 14:02:47 -0400 (0:00:00.045) 0:01:43.745 ****** 30582 1726855367.39545: entering _queue_task() for managed_node3/command 30582 1726855367.39814: worker is 1 (out of 1 available) 30582 1726855367.39827: exiting _queue_task() for managed_node3/command 30582 1726855367.39840: done queuing things up, now waiting for results queue to drain 30582 1726855367.39842: waiting for pending results... 
30582 1726855367.40033: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30582 1726855367.40128: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f1d 30582 1726855367.40141: variable 'ansible_search_path' from source: unknown 30582 1726855367.40145: variable 'ansible_search_path' from source: unknown 30582 1726855367.40177: calling self._execute() 30582 1726855367.40248: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.40253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.40262: variable 'omit' from source: magic vars 30582 1726855367.40538: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.40548: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.40635: variable 'profile_stat' from source: set_fact 30582 1726855367.40644: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855367.40647: when evaluation is False, skipping this task 30582 1726855367.40651: _execute() done 30582 1726855367.40655: dumping result to json 30582 1726855367.40657: done dumping result, returning 30582 1726855367.40662: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000001f1d] 30582 1726855367.40670: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1d 30582 1726855367.40754: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1d 30582 1726855367.40757: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855367.40807: no more pending results, returning what we have 30582 1726855367.40811: results queue empty 30582 1726855367.40812: checking for any_errors_fatal 30582 1726855367.40819: done checking for any_errors_fatal 30582 1726855367.40820: 
checking for max_fail_percentage 30582 1726855367.40821: done checking for max_fail_percentage 30582 1726855367.40822: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.40823: done checking to see if all hosts have failed 30582 1726855367.40824: getting the remaining hosts for this loop 30582 1726855367.40825: done getting the remaining hosts for this loop 30582 1726855367.40828: getting the next task for host managed_node3 30582 1726855367.40836: done getting next task for host managed_node3 30582 1726855367.40839: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855367.40844: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.40849: getting variables 30582 1726855367.40851: in VariableManager get_vars() 30582 1726855367.40900: Calling all_inventory to load vars for managed_node3 30582 1726855367.40902: Calling groups_inventory to load vars for managed_node3 30582 1726855367.40906: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.40917: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.40920: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.40922: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.42275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.43272: done with get_vars() 30582 1726855367.43296: done getting variables 30582 1726855367.43340: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855367.43427: variable 'profile' from source: play vars 30582 1726855367.43431: variable 'interface' from source: play vars 30582 1726855367.43472: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 14:02:47 -0400 (0:00:00.039) 0:01:43.784 ****** 30582 1726855367.43499: entering _queue_task() for managed_node3/set_fact 30582 1726855367.43767: worker is 1 (out of 1 available) 30582 1726855367.43780: exiting _queue_task() for managed_node3/set_fact 30582 1726855367.43795: done queuing things up, now waiting for results queue to drain 30582 1726855367.43797: waiting for pending results... 
30582 1726855367.43993: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30582 1726855367.44080: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f1e 30582 1726855367.44093: variable 'ansible_search_path' from source: unknown 30582 1726855367.44097: variable 'ansible_search_path' from source: unknown 30582 1726855367.44126: calling self._execute() 30582 1726855367.44203: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.44207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.44215: variable 'omit' from source: magic vars 30582 1726855367.44498: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.44508: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.44598: variable 'profile_stat' from source: set_fact 30582 1726855367.44607: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855367.44609: when evaluation is False, skipping this task 30582 1726855367.44612: _execute() done 30582 1726855367.44615: dumping result to json 30582 1726855367.44617: done dumping result, returning 30582 1726855367.44624: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000001f1e] 30582 1726855367.44629: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1e 30582 1726855367.44712: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1e 30582 1726855367.44715: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855367.44758: no more pending results, returning what we have 30582 1726855367.44761: results queue empty 30582 1726855367.44762: checking for any_errors_fatal 30582 1726855367.44769: done checking for any_errors_fatal 30582 1726855367.44769: 
checking for max_fail_percentage 30582 1726855367.44771: done checking for max_fail_percentage 30582 1726855367.44772: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.44773: done checking to see if all hosts have failed 30582 1726855367.44773: getting the remaining hosts for this loop 30582 1726855367.44775: done getting the remaining hosts for this loop 30582 1726855367.44779: getting the next task for host managed_node3 30582 1726855367.44786: done getting next task for host managed_node3 30582 1726855367.44790: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30582 1726855367.44795: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.44799: getting variables 30582 1726855367.44800: in VariableManager get_vars() 30582 1726855367.44844: Calling all_inventory to load vars for managed_node3 30582 1726855367.44846: Calling groups_inventory to load vars for managed_node3 30582 1726855367.44849: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.44861: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.44864: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.44866: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.45676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.46538: done with get_vars() 30582 1726855367.46555: done getting variables 30582 1726855367.46605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855367.46683: variable 'profile' from source: play vars 30582 1726855367.46686: variable 'interface' from source: play vars 30582 1726855367.46727: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 14:02:47 -0400 (0:00:00.032) 0:01:43.817 ****** 30582 1726855367.46752: entering _queue_task() for managed_node3/command 30582 1726855367.47003: worker is 1 (out of 1 available) 30582 1726855367.47016: exiting _queue_task() for managed_node3/command 30582 1726855367.47029: done queuing things up, now waiting for results queue to drain 30582 1726855367.47031: waiting for pending results... 
30582 1726855367.47223: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30582 1726855367.47317: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f1f 30582 1726855367.47330: variable 'ansible_search_path' from source: unknown 30582 1726855367.47334: variable 'ansible_search_path' from source: unknown 30582 1726855367.47361: calling self._execute() 30582 1726855367.47434: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.47437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.47446: variable 'omit' from source: magic vars 30582 1726855367.47718: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.47726: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.47815: variable 'profile_stat' from source: set_fact 30582 1726855367.47825: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855367.47828: when evaluation is False, skipping this task 30582 1726855367.47832: _execute() done 30582 1726855367.47834: dumping result to json 30582 1726855367.47837: done dumping result, returning 30582 1726855367.47841: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000001f1f] 30582 1726855367.47846: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1f 30582 1726855367.47934: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f1f 30582 1726855367.47937: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855367.47986: no more pending results, returning what we have 30582 1726855367.47991: results queue empty 30582 1726855367.47992: checking for any_errors_fatal 30582 1726855367.47999: done checking for any_errors_fatal 30582 1726855367.48000: checking for 
max_fail_percentage 30582 1726855367.48002: done checking for max_fail_percentage 30582 1726855367.48003: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.48003: done checking to see if all hosts have failed 30582 1726855367.48004: getting the remaining hosts for this loop 30582 1726855367.48005: done getting the remaining hosts for this loop 30582 1726855367.48009: getting the next task for host managed_node3 30582 1726855367.48017: done getting next task for host managed_node3 30582 1726855367.48019: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855367.48023: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.48028: getting variables 30582 1726855367.48029: in VariableManager get_vars() 30582 1726855367.48069: Calling all_inventory to load vars for managed_node3 30582 1726855367.48072: Calling groups_inventory to load vars for managed_node3 30582 1726855367.48075: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.48086: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.48097: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.48101: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.53793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.55336: done with get_vars() 30582 1726855367.55371: done getting variables 30582 1726855367.55433: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855367.55538: variable 'profile' from source: play vars 30582 1726855367.55542: variable 'interface' from source: play vars 30582 1726855367.55605: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:02:47 -0400 (0:00:00.088) 0:01:43.906 ****** 30582 1726855367.55637: entering _queue_task() for managed_node3/set_fact 30582 1726855367.56021: worker is 1 (out of 1 available) 30582 1726855367.56035: exiting _queue_task() for managed_node3/set_fact 30582 1726855367.56048: done queuing things up, now waiting for results queue to drain 30582 1726855367.56050: waiting for pending results... 
30582 1726855367.56510: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855367.56547: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f20 30582 1726855367.56564: variable 'ansible_search_path' from source: unknown 30582 1726855367.56569: variable 'ansible_search_path' from source: unknown 30582 1726855367.56607: calling self._execute() 30582 1726855367.56717: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.56733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.56851: variable 'omit' from source: magic vars 30582 1726855367.57178: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.57193: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.57329: variable 'profile_stat' from source: set_fact 30582 1726855367.57342: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855367.57347: when evaluation is False, skipping this task 30582 1726855367.57349: _execute() done 30582 1726855367.57352: dumping result to json 30582 1726855367.57355: done dumping result, returning 30582 1726855367.57361: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000001f20] 30582 1726855367.57367: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f20 30582 1726855367.57471: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f20 30582 1726855367.57473: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855367.57540: no more pending results, returning what we have 30582 1726855367.57544: results queue empty 30582 1726855367.57545: checking for any_errors_fatal 30582 1726855367.57554: done checking for any_errors_fatal 30582 1726855367.57554: checking 
for max_fail_percentage 30582 1726855367.57557: done checking for max_fail_percentage 30582 1726855367.57558: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.57558: done checking to see if all hosts have failed 30582 1726855367.57559: getting the remaining hosts for this loop 30582 1726855367.57560: done getting the remaining hosts for this loop 30582 1726855367.57564: getting the next task for host managed_node3 30582 1726855367.57573: done getting next task for host managed_node3 30582 1726855367.57575: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30582 1726855367.57579: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.57586: getting variables 30582 1726855367.57589: in VariableManager get_vars() 30582 1726855367.57631: Calling all_inventory to load vars for managed_node3 30582 1726855367.57634: Calling groups_inventory to load vars for managed_node3 30582 1726855367.57637: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.57649: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.57651: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.57654: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.58496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.59390: done with get_vars() 30582 1726855367.59407: done getting variables 30582 1726855367.59451: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855367.59536: variable 'profile' from source: play vars 30582 1726855367.59539: variable 'interface' from source: play vars 30582 1726855367.59581: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 14:02:47 -0400 (0:00:00.039) 0:01:43.945 ****** 30582 1726855367.59607: entering _queue_task() for managed_node3/assert 30582 1726855367.59848: worker is 1 (out of 1 available) 30582 1726855367.59863: exiting _queue_task() for managed_node3/assert 30582 1726855367.59876: done queuing things up, now waiting for results queue to drain 30582 1726855367.59877: waiting for pending results... 
30582 1726855367.60062: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30582 1726855367.60145: in run() - task 0affcc66-ac2b-aa83-7d57-000000001e9a 30582 1726855367.60158: variable 'ansible_search_path' from source: unknown 30582 1726855367.60161: variable 'ansible_search_path' from source: unknown 30582 1726855367.60192: calling self._execute() 30582 1726855367.60267: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.60273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.60281: variable 'omit' from source: magic vars 30582 1726855367.60566: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.60579: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.60585: variable 'omit' from source: magic vars 30582 1726855367.60626: variable 'omit' from source: magic vars 30582 1726855367.60703: variable 'profile' from source: play vars 30582 1726855367.60707: variable 'interface' from source: play vars 30582 1726855367.60759: variable 'interface' from source: play vars 30582 1726855367.60774: variable 'omit' from source: magic vars 30582 1726855367.60809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855367.60836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855367.60853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855367.60872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855367.60882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855367.60907: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855367.60910: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.60913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.60989: Set connection var ansible_timeout to 10 30582 1726855367.60992: Set connection var ansible_connection to ssh 30582 1726855367.60998: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855367.61003: Set connection var ansible_pipelining to False 30582 1726855367.61008: Set connection var ansible_shell_executable to /bin/sh 30582 1726855367.61011: Set connection var ansible_shell_type to sh 30582 1726855367.61028: variable 'ansible_shell_executable' from source: unknown 30582 1726855367.61030: variable 'ansible_connection' from source: unknown 30582 1726855367.61033: variable 'ansible_module_compression' from source: unknown 30582 1726855367.61035: variable 'ansible_shell_type' from source: unknown 30582 1726855367.61037: variable 'ansible_shell_executable' from source: unknown 30582 1726855367.61039: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.61043: variable 'ansible_pipelining' from source: unknown 30582 1726855367.61046: variable 'ansible_timeout' from source: unknown 30582 1726855367.61050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.61154: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855367.61164: variable 'omit' from source: magic vars 30582 1726855367.61171: starting attempt loop 30582 1726855367.61174: running the handler 30582 1726855367.61261: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855367.61265: Evaluated conditional (not 
lsr_net_profile_exists): True 30582 1726855367.61273: handler run complete 30582 1726855367.61284: attempt loop complete, returning result 30582 1726855367.61288: _execute() done 30582 1726855367.61291: dumping result to json 30582 1726855367.61294: done dumping result, returning 30582 1726855367.61302: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-000000001e9a] 30582 1726855367.61311: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001e9a 30582 1726855367.61393: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001e9a 30582 1726855367.61396: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855367.61458: no more pending results, returning what we have 30582 1726855367.61462: results queue empty 30582 1726855367.61462: checking for any_errors_fatal 30582 1726855367.61470: done checking for any_errors_fatal 30582 1726855367.61471: checking for max_fail_percentage 30582 1726855367.61473: done checking for max_fail_percentage 30582 1726855367.61474: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.61474: done checking to see if all hosts have failed 30582 1726855367.61475: getting the remaining hosts for this loop 30582 1726855367.61476: done getting the remaining hosts for this loop 30582 1726855367.61480: getting the next task for host managed_node3 30582 1726855367.61491: done getting next task for host managed_node3 30582 1726855367.61495: ^ task is: TASK: Conditional asserts 30582 1726855367.61498: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855367.61503: getting variables 30582 1726855367.61505: in VariableManager get_vars() 30582 1726855367.61549: Calling all_inventory to load vars for managed_node3 30582 1726855367.61551: Calling groups_inventory to load vars for managed_node3 30582 1726855367.61555: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.61565: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.61568: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.61570: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.62541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.63405: done with get_vars() 30582 1726855367.63423: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 14:02:47 -0400 (0:00:00.038) 0:01:43.984 ****** 30582 1726855367.63496: entering _queue_task() for managed_node3/include_tasks 30582 1726855367.63763: worker is 1 (out of 1 available) 30582 1726855367.63777: exiting _queue_task() for managed_node3/include_tasks 30582 1726855367.63792: done queuing things up, now waiting for results queue to drain 30582 1726855367.63794: waiting for pending results... 
30582 1726855367.63995: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855367.64075: in run() - task 0affcc66-ac2b-aa83-7d57-00000000174a 30582 1726855367.64089: variable 'ansible_search_path' from source: unknown 30582 1726855367.64092: variable 'ansible_search_path' from source: unknown 30582 1726855367.64310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855367.65873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855367.65924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855367.65952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855367.65983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855367.66006: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855367.66076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855367.66102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855367.66120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855367.66145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30582 1726855367.66156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855367.66253: variable 'lsr_assert_when' from source: include params 30582 1726855367.66341: variable 'network_provider' from source: set_fact 30582 1726855367.66400: variable 'omit' from source: magic vars 30582 1726855367.66489: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.66495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.66504: variable 'omit' from source: magic vars 30582 1726855367.66649: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.66657: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.66741: variable 'item' from source: unknown 30582 1726855367.66744: Evaluated conditional (item['condition']): True 30582 1726855367.66802: variable 'item' from source: unknown 30582 1726855367.66825: variable 'item' from source: unknown 30582 1726855367.66876: variable 'item' from source: unknown 30582 1726855367.67024: dumping result to json 30582 1726855367.67028: done dumping result, returning 30582 1726855367.67030: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-00000000174a] 30582 1726855367.67032: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174a 30582 1726855367.67067: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174a 30582 1726855367.67070: WORKER PROCESS EXITING 30582 1726855367.67092: no more pending results, returning what we have 30582 1726855367.67098: in VariableManager get_vars() 30582 1726855367.67143: Calling all_inventory to load vars for managed_node3 30582 1726855367.67146: Calling groups_inventory to load vars for managed_node3 30582 1726855367.67149: 
Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.67161: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.67163: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.67166: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.68019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.68901: done with get_vars() 30582 1726855367.68921: variable 'ansible_search_path' from source: unknown 30582 1726855367.68922: variable 'ansible_search_path' from source: unknown 30582 1726855367.68953: we have included files to process 30582 1726855367.68953: generating all_blocks data 30582 1726855367.68955: done generating all_blocks data 30582 1726855367.68959: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855367.68959: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855367.68961: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855367.69040: in VariableManager get_vars() 30582 1726855367.69058: done with get_vars() 30582 1726855367.69140: done processing included file 30582 1726855367.69141: iterating over new_blocks loaded from include file 30582 1726855367.69142: in VariableManager get_vars() 30582 1726855367.69153: done with get_vars() 30582 1726855367.69155: filtering new block on tags 30582 1726855367.69179: done filtering new block on tags 30582 1726855367.69181: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30582 1726855367.69185: extending task lists for all hosts with included blocks 30582 1726855367.69865: done extending task lists 30582 1726855367.69867: done processing included files 30582 1726855367.69867: results queue empty 30582 1726855367.69868: checking for any_errors_fatal 30582 1726855367.69871: done checking for any_errors_fatal 30582 1726855367.69871: checking for max_fail_percentage 30582 1726855367.69872: done checking for max_fail_percentage 30582 1726855367.69873: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.69873: done checking to see if all hosts have failed 30582 1726855367.69874: getting the remaining hosts for this loop 30582 1726855367.69875: done getting the remaining hosts for this loop 30582 1726855367.69876: getting the next task for host managed_node3 30582 1726855367.69879: done getting next task for host managed_node3 30582 1726855367.69881: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855367.69882: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.69891: getting variables 30582 1726855367.69893: in VariableManager get_vars() 30582 1726855367.69901: Calling all_inventory to load vars for managed_node3 30582 1726855367.69903: Calling groups_inventory to load vars for managed_node3 30582 1726855367.69904: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.69909: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.69910: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.69912: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.70674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.71532: done with get_vars() 30582 1726855367.71547: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 14:02:47 -0400 (0:00:00.081) 0:01:44.065 ****** 30582 1726855367.71603: entering _queue_task() for managed_node3/include_tasks 30582 1726855367.71920: worker is 1 (out of 1 available) 30582 1726855367.71933: exiting _queue_task() for managed_node3/include_tasks 30582 1726855367.71946: done queuing things up, now waiting for results queue to drain 30582 1726855367.71947: waiting for pending results... 
30582 1726855367.72143: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855367.72228: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f59 30582 1726855367.72239: variable 'ansible_search_path' from source: unknown 30582 1726855367.72243: variable 'ansible_search_path' from source: unknown 30582 1726855367.72275: calling self._execute() 30582 1726855367.72349: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.72352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.72361: variable 'omit' from source: magic vars 30582 1726855367.72649: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.72659: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.72665: _execute() done 30582 1726855367.72670: dumping result to json 30582 1726855367.72673: done dumping result, returning 30582 1726855367.72680: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-000000001f59] 30582 1726855367.72685: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f59 30582 1726855367.72774: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f59 30582 1726855367.72777: WORKER PROCESS EXITING 30582 1726855367.72805: no more pending results, returning what we have 30582 1726855367.72811: in VariableManager get_vars() 30582 1726855367.72857: Calling all_inventory to load vars for managed_node3 30582 1726855367.72860: Calling groups_inventory to load vars for managed_node3 30582 1726855367.72863: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.72877: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.72881: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.72883: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855367.73929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.75301: done with get_vars() 30582 1726855367.75320: variable 'ansible_search_path' from source: unknown 30582 1726855367.75321: variable 'ansible_search_path' from source: unknown 30582 1726855367.75429: variable 'item' from source: include params 30582 1726855367.75455: we have included files to process 30582 1726855367.75456: generating all_blocks data 30582 1726855367.75457: done generating all_blocks data 30582 1726855367.75458: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855367.75459: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855367.75461: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855367.75592: done processing included file 30582 1726855367.75594: iterating over new_blocks loaded from include file 30582 1726855367.75595: in VariableManager get_vars() 30582 1726855367.75610: done with get_vars() 30582 1726855367.75611: filtering new block on tags 30582 1726855367.75629: done filtering new block on tags 30582 1726855367.75630: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855367.75634: extending task lists for all hosts with included blocks 30582 1726855367.75734: done extending task lists 30582 1726855367.75735: done processing included files 30582 1726855367.75735: results queue empty 30582 1726855367.75736: checking for any_errors_fatal 30582 1726855367.75738: done checking for any_errors_fatal 30582 1726855367.75739: checking for 
max_fail_percentage 30582 1726855367.75740: done checking for max_fail_percentage 30582 1726855367.75740: checking to see if all hosts have failed and the running result is not ok 30582 1726855367.75741: done checking to see if all hosts have failed 30582 1726855367.75741: getting the remaining hosts for this loop 30582 1726855367.75742: done getting the remaining hosts for this loop 30582 1726855367.75744: getting the next task for host managed_node3 30582 1726855367.75747: done getting next task for host managed_node3 30582 1726855367.75749: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855367.75751: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855367.75753: getting variables 30582 1726855367.75753: in VariableManager get_vars() 30582 1726855367.75761: Calling all_inventory to load vars for managed_node3 30582 1726855367.75763: Calling groups_inventory to load vars for managed_node3 30582 1726855367.75766: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855367.75770: Calling all_plugins_play to load vars for managed_node3 30582 1726855367.75772: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855367.75773: Calling groups_plugins_play to load vars for managed_node3 30582 1726855367.76444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855367.77662: done with get_vars() 30582 1726855367.77690: done getting variables 30582 1726855367.77807: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:02:47 -0400 (0:00:00.062) 0:01:44.128 ****** 30582 1726855367.77836: entering _queue_task() for managed_node3/stat 30582 1726855367.78203: worker is 1 (out of 1 available) 30582 1726855367.78217: exiting _queue_task() for managed_node3/stat 30582 1726855367.78230: done queuing things up, now waiting for results queue to drain 30582 1726855367.78232: waiting for pending results... 
30582 1726855367.78616: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855367.78696: in run() - task 0affcc66-ac2b-aa83-7d57-000000001fe8 30582 1726855367.78701: variable 'ansible_search_path' from source: unknown 30582 1726855367.78707: variable 'ansible_search_path' from source: unknown 30582 1726855367.78730: calling self._execute() 30582 1726855367.78831: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.78845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.78892: variable 'omit' from source: magic vars 30582 1726855367.79261: variable 'ansible_distribution_major_version' from source: facts 30582 1726855367.79281: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855367.79295: variable 'omit' from source: magic vars 30582 1726855367.79348: variable 'omit' from source: magic vars 30582 1726855367.79458: variable 'interface' from source: play vars 30582 1726855367.79582: variable 'omit' from source: magic vars 30582 1726855367.79586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855367.79590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855367.79598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855367.79620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855367.79635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855367.79670: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855367.79679: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.79692: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.79802: Set connection var ansible_timeout to 10 30582 1726855367.79809: Set connection var ansible_connection to ssh 30582 1726855367.79821: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855367.79830: Set connection var ansible_pipelining to False 30582 1726855367.79838: Set connection var ansible_shell_executable to /bin/sh 30582 1726855367.79843: Set connection var ansible_shell_type to sh 30582 1726855367.79871: variable 'ansible_shell_executable' from source: unknown 30582 1726855367.79879: variable 'ansible_connection' from source: unknown 30582 1726855367.79885: variable 'ansible_module_compression' from source: unknown 30582 1726855367.79894: variable 'ansible_shell_type' from source: unknown 30582 1726855367.79904: variable 'ansible_shell_executable' from source: unknown 30582 1726855367.79910: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855367.79916: variable 'ansible_pipelining' from source: unknown 30582 1726855367.79923: variable 'ansible_timeout' from source: unknown 30582 1726855367.79929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855367.80226: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855367.80230: variable 'omit' from source: magic vars 30582 1726855367.80233: starting attempt loop 30582 1726855367.80235: running the handler 30582 1726855367.80237: _low_level_execute_command(): starting 30582 1726855367.80239: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855367.80976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855367.81002: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30582 1726855367.81107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.81125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.81144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.81167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.81267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.82998: stdout chunk (state=3): >>>/root <<< 30582 1726855367.83145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.83161: stdout chunk (state=3): >>><<< 30582 1726855367.83183: stderr chunk (state=3): >>><<< 30582 1726855367.83219: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.83243: _low_level_execute_command(): starting 30582 1726855367.83339: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453 `" && echo ansible-tmp-1726855367.8322732-35457-172586648935453="` echo /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453 `" ) && sleep 0' 30582 1726855367.83957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855367.83974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.84001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855367.84113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855367.84142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.84163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.84268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.86167: stdout chunk (state=3): >>>ansible-tmp-1726855367.8322732-35457-172586648935453=/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453 <<< 30582 1726855367.86280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.86310: stderr chunk (state=3): >>><<< 30582 1726855367.86313: stdout chunk (state=3): >>><<< 30582 1726855367.86329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855367.8322732-35457-172586648935453=/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.86375: variable 'ansible_module_compression' from source: unknown 30582 1726855367.86424: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855367.86455: variable 'ansible_facts' from source: unknown 30582 1726855367.86520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py 30582 1726855367.86624: Sending initial data 30582 1726855367.86627: Sent initial data (153 bytes) 30582 1726855367.87058: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.87092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855367.87096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855367.87100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855367.87102: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855367.87104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.87162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.87165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.87236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.88782: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855367.88840: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855367.88899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdwazghnt /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py <<< 30582 1726855367.88903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py" <<< 30582 1726855367.88959: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdwazghnt" to remote "/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py" <<< 30582 1726855367.89558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.89604: stderr chunk (state=3): >>><<< 30582 1726855367.89607: stdout chunk (state=3): >>><<< 30582 1726855367.89625: done transferring module to remote 30582 1726855367.89634: _low_level_execute_command(): starting 30582 1726855367.89639: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/ /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py && sleep 0' 30582 1726855367.90211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.90273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.90318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855367.92099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855367.92123: stderr chunk (state=3): >>><<< 30582 1726855367.92126: stdout chunk (state=3): >>><<< 30582 1726855367.92140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855367.92143: _low_level_execute_command(): starting 30582 1726855367.92148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/AnsiballZ_stat.py && sleep 0' 30582 1726855367.92571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855367.92578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855367.92595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.92614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855367.92617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855367.92681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855367.92689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855367.92692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855367.92763: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30582 1726855368.07898: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855368.09209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855368.09244: stderr chunk (state=3): >>><<< 30582 1726855368.09247: stdout chunk (state=3): >>><<< 30582 1726855368.09260: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855368.09286: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855368.09297: _low_level_execute_command(): starting 30582 1726855368.09302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855367.8322732-35457-172586648935453/ > /dev/null 2>&1 && sleep 0' 30582 1726855368.09757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.09760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855368.09762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.09764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.09766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.09823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.09827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.09836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.09892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.11734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855368.11758: stderr chunk (state=3): >>><<< 30582 1726855368.11762: stdout chunk (state=3): >>><<< 30582 1726855368.11778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855368.11783: handler run complete 30582 1726855368.11803: attempt loop complete, returning result 30582 1726855368.11806: _execute() done 30582 1726855368.11808: dumping result to json 30582 1726855368.11810: done dumping result, returning 30582 1726855368.11819: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000001fe8] 30582 1726855368.11823: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001fe8 30582 1726855368.11924: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001fe8 30582 1726855368.11927: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855368.12011: no more pending results, returning what we have 30582 1726855368.12016: results queue empty 30582 1726855368.12017: checking for any_errors_fatal 30582 1726855368.12018: done checking for any_errors_fatal 30582 1726855368.12019: checking for max_fail_percentage 30582 1726855368.12021: done checking for max_fail_percentage 30582 1726855368.12022: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.12023: done checking to see if all hosts have failed 30582 1726855368.12024: getting the remaining hosts for this loop 30582 1726855368.12025: done getting the remaining hosts for this loop 30582 1726855368.12029: getting the next task for host managed_node3 30582 1726855368.12038: done getting next task for host managed_node3 30582 1726855368.12040: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30582 1726855368.12045: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855368.12051: getting variables 30582 1726855368.12052: in VariableManager get_vars() 30582 1726855368.12097: Calling all_inventory to load vars for managed_node3 30582 1726855368.12100: Calling groups_inventory to load vars for managed_node3 30582 1726855368.12103: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.12114: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.12117: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.12119: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.13051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.13934: done with get_vars() 30582 1726855368.13952: done getting variables 30582 1726855368.14002: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855368.14094: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] 
************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 14:02:48 -0400 (0:00:00.362) 0:01:44.491 ****** 30582 1726855368.14118: entering _queue_task() for managed_node3/assert 30582 1726855368.14394: worker is 1 (out of 1 available) 30582 1726855368.14408: exiting _queue_task() for managed_node3/assert 30582 1726855368.14422: done queuing things up, now waiting for results queue to drain 30582 1726855368.14423: waiting for pending results... 30582 1726855368.14612: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30582 1726855368.14702: in run() - task 0affcc66-ac2b-aa83-7d57-000000001f5a 30582 1726855368.14714: variable 'ansible_search_path' from source: unknown 30582 1726855368.14717: variable 'ansible_search_path' from source: unknown 30582 1726855368.14751: calling self._execute() 30582 1726855368.14824: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.14828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.14836: variable 'omit' from source: magic vars 30582 1726855368.15112: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.15123: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.15129: variable 'omit' from source: magic vars 30582 1726855368.15161: variable 'omit' from source: magic vars 30582 1726855368.15236: variable 'interface' from source: play vars 30582 1726855368.15250: variable 'omit' from source: magic vars 30582 1726855368.15282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855368.15313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.15332: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855368.15345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.15355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.15381: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.15385: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.15389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.15462: Set connection var ansible_timeout to 10 30582 1726855368.15468: Set connection var ansible_connection to ssh 30582 1726855368.15471: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.15476: Set connection var ansible_pipelining to False 30582 1726855368.15481: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.15484: Set connection var ansible_shell_type to sh 30582 1726855368.15501: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.15505: variable 'ansible_connection' from source: unknown 30582 1726855368.15507: variable 'ansible_module_compression' from source: unknown 30582 1726855368.15509: variable 'ansible_shell_type' from source: unknown 30582 1726855368.15513: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.15516: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.15518: variable 'ansible_pipelining' from source: unknown 30582 1726855368.15521: variable 'ansible_timeout' from source: unknown 30582 1726855368.15524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.15625: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.15640: variable 'omit' from source: magic vars 30582 1726855368.15647: starting attempt loop 30582 1726855368.15649: running the handler 30582 1726855368.15741: variable 'interface_stat' from source: set_fact 30582 1726855368.15748: Evaluated conditional (not interface_stat.stat.exists): True 30582 1726855368.15760: handler run complete 30582 1726855368.15775: attempt loop complete, returning result 30582 1726855368.15778: _execute() done 30582 1726855368.15780: dumping result to json 30582 1726855368.15783: done dumping result, returning 30582 1726855368.15791: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-000000001f5a] 30582 1726855368.15796: sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f5a 30582 1726855368.15884: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000001f5a 30582 1726855368.15888: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855368.15933: no more pending results, returning what we have 30582 1726855368.15936: results queue empty 30582 1726855368.15937: checking for any_errors_fatal 30582 1726855368.15946: done checking for any_errors_fatal 30582 1726855368.15947: checking for max_fail_percentage 30582 1726855368.15949: done checking for max_fail_percentage 30582 1726855368.15950: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.15951: done checking to see if all hosts have failed 30582 1726855368.15951: getting the remaining hosts for this loop 30582 1726855368.15953: done getting the remaining hosts for this loop 30582 1726855368.15956: getting the next task for host managed_node3 30582 1726855368.15965: done getting next task for host managed_node3 
30582 1726855368.15968: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855368.15970: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855368.15976: getting variables 30582 1726855368.15977: in VariableManager get_vars() 30582 1726855368.16021: Calling all_inventory to load vars for managed_node3 30582 1726855368.16023: Calling groups_inventory to load vars for managed_node3 30582 1726855368.16026: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.16037: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.16040: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.16042: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.16876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.17755: done with get_vars() 30582 1726855368.17775: done getting variables 30582 1726855368.17821: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855368.17912: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 14:02:48 -0400 (0:00:00.038) 0:01:44.529 ****** 30582 1726855368.17936: entering _queue_task() for managed_node3/debug 30582 1726855368.18211: worker is 1 (out of 1 available) 30582 1726855368.18226: exiting _queue_task() for managed_node3/debug 30582 1726855368.18238: done queuing things up, now waiting for results queue to drain 30582 1726855368.18240: waiting for pending results... 30582 1726855368.18437: running TaskExecutor() for managed_node3/TASK: Success in test 'I can take a profile down that is absent' 30582 1726855368.18520: in run() - task 0affcc66-ac2b-aa83-7d57-00000000174b 30582 1726855368.18532: variable 'ansible_search_path' from source: unknown 30582 1726855368.18536: variable 'ansible_search_path' from source: unknown 30582 1726855368.18564: calling self._execute() 30582 1726855368.18638: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.18642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.18650: variable 'omit' from source: magic vars 30582 1726855368.18930: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.18939: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.18946: variable 'omit' from source: magic vars 30582 1726855368.18973: variable 'omit' from source: magic vars 30582 1726855368.19045: variable 'lsr_description' from source: include params 30582 1726855368.19060: variable 'omit' from source: magic vars 30582 1726855368.19097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855368.19126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.19144: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855368.19158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.19170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.19196: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.19199: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.19202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.19278: Set connection var ansible_timeout to 10 30582 1726855368.19281: Set connection var ansible_connection to ssh 30582 1726855368.19286: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.19292: Set connection var ansible_pipelining to False 30582 1726855368.19297: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.19300: Set connection var ansible_shell_type to sh 30582 1726855368.19317: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.19320: variable 'ansible_connection' from source: unknown 30582 1726855368.19322: variable 'ansible_module_compression' from source: unknown 30582 1726855368.19325: variable 'ansible_shell_type' from source: unknown 30582 1726855368.19327: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.19329: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.19333: variable 'ansible_pipelining' from source: unknown 30582 1726855368.19336: variable 'ansible_timeout' from source: unknown 30582 1726855368.19346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.19447: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.19459: variable 'omit' from source: magic vars 30582 1726855368.19462: starting attempt loop 30582 1726855368.19465: running the handler 30582 1726855368.19504: handler run complete 30582 1726855368.19514: attempt loop complete, returning result 30582 1726855368.19517: _execute() done 30582 1726855368.19520: dumping result to json 30582 1726855368.19523: done dumping result, returning 30582 1726855368.19529: done running TaskExecutor() for managed_node3/TASK: Success in test 'I can take a profile down that is absent' [0affcc66-ac2b-aa83-7d57-00000000174b] 30582 1726855368.19535: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174b 30582 1726855368.19617: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174b 30582 1726855368.19619: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 30582 1726855368.19669: no more pending results, returning what we have 30582 1726855368.19673: results queue empty 30582 1726855368.19674: checking for any_errors_fatal 30582 1726855368.19680: done checking for any_errors_fatal 30582 1726855368.19681: checking for max_fail_percentage 30582 1726855368.19682: done checking for max_fail_percentage 30582 1726855368.19683: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.19684: done checking to see if all hosts have failed 30582 1726855368.19685: getting the remaining hosts for this loop 30582 1726855368.19686: done getting the remaining hosts for this loop 30582 1726855368.19692: getting the next task for host managed_node3 30582 1726855368.19698: done getting next task for host managed_node3 30582 1726855368.19702: ^ task is: TASK: Cleanup 30582 1726855368.19704: ^ state is: HOST STATE: block=7, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855368.19710: getting variables 30582 1726855368.19711: in VariableManager get_vars() 30582 1726855368.19755: Calling all_inventory to load vars for managed_node3 30582 1726855368.19757: Calling groups_inventory to load vars for managed_node3 30582 1726855368.19761: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.19771: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.19774: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.19776: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.20708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.21571: done with get_vars() 30582 1726855368.21586: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:02:48 -0400 (0:00:00.037) 0:01:44.566 ****** 30582 1726855368.21656: entering _queue_task() for managed_node3/include_tasks 30582 1726855368.21912: worker is 1 (out of 1 available) 30582 1726855368.21927: exiting _queue_task() for managed_node3/include_tasks 30582 1726855368.21939: done queuing things up, now waiting for results queue to drain 30582 1726855368.21941: waiting for pending results... 
30582 1726855368.22132: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855368.22208: in run() - task 0affcc66-ac2b-aa83-7d57-00000000174f 30582 1726855368.22219: variable 'ansible_search_path' from source: unknown 30582 1726855368.22222: variable 'ansible_search_path' from source: unknown 30582 1726855368.22260: variable 'lsr_cleanup' from source: include params 30582 1726855368.22421: variable 'lsr_cleanup' from source: include params 30582 1726855368.22476: variable 'omit' from source: magic vars 30582 1726855368.22579: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.22584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.22593: variable 'omit' from source: magic vars 30582 1726855368.22766: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.22776: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.22782: variable 'item' from source: unknown 30582 1726855368.22833: variable 'item' from source: unknown 30582 1726855368.22854: variable 'item' from source: unknown 30582 1726855368.22901: variable 'item' from source: unknown 30582 1726855368.23024: dumping result to json 30582 1726855368.23027: done dumping result, returning 30582 1726855368.23029: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-00000000174f] 30582 1726855368.23030: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174f 30582 1726855368.23066: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000174f 30582 1726855368.23069: WORKER PROCESS EXITING 30582 1726855368.23089: no more pending results, returning what we have 30582 1726855368.23094: in VariableManager get_vars() 30582 1726855368.23140: Calling all_inventory to load vars for managed_node3 30582 1726855368.23143: Calling groups_inventory to load vars for managed_node3 30582 1726855368.23146: Calling 
all_plugins_inventory to load vars for managed_node3 30582 1726855368.23157: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.23160: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.23163: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.23995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.24991: done with get_vars() 30582 1726855368.25007: variable 'ansible_search_path' from source: unknown 30582 1726855368.25008: variable 'ansible_search_path' from source: unknown 30582 1726855368.25040: we have included files to process 30582 1726855368.25041: generating all_blocks data 30582 1726855368.25042: done generating all_blocks data 30582 1726855368.25045: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855368.25046: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855368.25048: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855368.25189: done processing included file 30582 1726855368.25190: iterating over new_blocks loaded from include file 30582 1726855368.25192: in VariableManager get_vars() 30582 1726855368.25204: done with get_vars() 30582 1726855368.25205: filtering new block on tags 30582 1726855368.25222: done filtering new block on tags 30582 1726855368.25223: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855368.25227: extending task lists for all hosts with included blocks 
30582 1726855368.25969: done extending task lists 30582 1726855368.25971: done processing included files 30582 1726855368.25971: results queue empty 30582 1726855368.25972: checking for any_errors_fatal 30582 1726855368.25974: done checking for any_errors_fatal 30582 1726855368.25975: checking for max_fail_percentage 30582 1726855368.25975: done checking for max_fail_percentage 30582 1726855368.25976: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.25977: done checking to see if all hosts have failed 30582 1726855368.25977: getting the remaining hosts for this loop 30582 1726855368.25978: done getting the remaining hosts for this loop 30582 1726855368.25980: getting the next task for host managed_node3 30582 1726855368.25983: done getting next task for host managed_node3 30582 1726855368.25984: ^ task is: TASK: Cleanup profile and device 30582 1726855368.25986: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855368.25990: getting variables 30582 1726855368.25991: in VariableManager get_vars() 30582 1726855368.26002: Calling all_inventory to load vars for managed_node3 30582 1726855368.26004: Calling groups_inventory to load vars for managed_node3 30582 1726855368.26006: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.26011: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.26012: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.26014: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.26760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.27628: done with get_vars() 30582 1726855368.27653: done getting variables 30582 1726855368.27689: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:02:48 -0400 (0:00:00.060) 0:01:44.627 ****** 30582 1726855368.27714: entering _queue_task() for managed_node3/shell 30582 1726855368.27995: worker is 1 (out of 1 available) 30582 1726855368.28009: exiting _queue_task() for managed_node3/shell 30582 1726855368.28022: done queuing things up, now waiting for results queue to drain 30582 1726855368.28024: waiting for pending results... 
30582 1726855368.28223: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855368.28302: in run() - task 0affcc66-ac2b-aa83-7d57-00000000200b 30582 1726855368.28314: variable 'ansible_search_path' from source: unknown 30582 1726855368.28317: variable 'ansible_search_path' from source: unknown 30582 1726855368.28345: calling self._execute() 30582 1726855368.28425: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.28429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.28438: variable 'omit' from source: magic vars 30582 1726855368.28993: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.28997: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.28999: variable 'omit' from source: magic vars 30582 1726855368.29003: variable 'omit' from source: magic vars 30582 1726855368.29026: variable 'interface' from source: play vars 30582 1726855368.29055: variable 'omit' from source: magic vars 30582 1726855368.29105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855368.29148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.29179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855368.29205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.29224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.29261: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.29273: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.29281: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.29391: Set connection var ansible_timeout to 10 30582 1726855368.29399: Set connection var ansible_connection to ssh 30582 1726855368.29410: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.29418: Set connection var ansible_pipelining to False 30582 1726855368.29426: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.29433: Set connection var ansible_shell_type to sh 30582 1726855368.29461: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.29473: variable 'ansible_connection' from source: unknown 30582 1726855368.29479: variable 'ansible_module_compression' from source: unknown 30582 1726855368.29485: variable 'ansible_shell_type' from source: unknown 30582 1726855368.29494: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.29500: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.29507: variable 'ansible_pipelining' from source: unknown 30582 1726855368.29513: variable 'ansible_timeout' from source: unknown 30582 1726855368.29519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.29662: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.29684: variable 'omit' from source: magic vars 30582 1726855368.29697: starting attempt loop 30582 1726855368.29705: running the handler 30582 1726855368.29722: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.29749: _low_level_execute_command(): starting 30582 1726855368.29761: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855368.30322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.30333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.30340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855368.30346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855368.30362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.30378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.30433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.30436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.30438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.30511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30582 1726855368.32456: stdout chunk (state=3): >>>/root <<< 30582 1726855368.32460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855368.32462: stdout chunk (state=3): >>><<< 30582 1726855368.32464: stderr chunk (state=3): >>><<< 30582 1726855368.32485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855368.32512: _low_level_execute_command(): starting 30582 1726855368.32603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088 `" && echo ansible-tmp-1726855368.3249736-35475-199603228137088="` echo /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088 `" ) && sleep 0' 30582 
1726855368.33139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855368.33153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855368.33166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.33184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855368.33204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855368.33215: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855368.33228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.33312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.33338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.33372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.33463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.35379: stdout chunk (state=3): >>>ansible-tmp-1726855368.3249736-35475-199603228137088=/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088 <<< 30582 1726855368.35516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
30582 1726855368.35535: stdout chunk (state=3): >>><<< 30582 1726855368.35547: stderr chunk (state=3): >>><<< 30582 1726855368.35574: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855368.3249736-35475-199603228137088=/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855368.35618: variable 'ansible_module_compression' from source: unknown 30582 1726855368.35691: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855368.35737: variable 'ansible_facts' from source: unknown 30582 1726855368.35937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py 30582 1726855368.36010: Sending initial data 30582 
1726855368.36020: Sent initial data (156 bytes) 30582 1726855368.36600: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855368.36616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855368.36703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.36730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.36744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.36764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.36968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.38513: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855368.38606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855368.38672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp92k97d1l /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py <<< 30582 1726855368.38683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py" <<< 30582 1726855368.38719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp92k97d1l" to remote "/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py" <<< 30582 1726855368.39608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855368.39649: stderr chunk (state=3): >>><<< 30582 1726855368.39658: stdout chunk (state=3): >>><<< 30582 1726855368.39782: done transferring module to remote 30582 1726855368.39785: _low_level_execute_command(): starting 30582 1726855368.39794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/ /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py && sleep 0' 30582 1726855368.40368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 
1726855368.40382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855368.40401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.40417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855368.40431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855368.40441: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855368.40535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.40556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.40577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.40729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.42508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855368.42586: stderr chunk (state=3): >>><<< 30582 1726855368.42598: stdout chunk (state=3): >>><<< 30582 1726855368.42617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855368.42625: _low_level_execute_command(): starting 30582 1726855368.42675: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/AnsiballZ_command.py && sleep 0' 30582 1726855368.43269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855368.43283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855368.43299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.43338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855368.43353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855368.43401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.43447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.43481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.43522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.43601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.62222: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:02:48.586564", "end": "2024-09-20 14:02:48.619808", "delta": "0:00:00.033244", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 30582 1726855368.63585: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 30582 1726855368.63603: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855368.63667: stderr chunk (state=3): >>><<< 30582 1726855368.63684: stdout chunk (state=3): >>><<< 30582 1726855368.63712: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:02:48.586564", "end": "2024-09-20 14:02:48.619808", "delta": "0:00:00.033244", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 30582 1726855368.63767: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855368.63783: _low_level_execute_command(): starting 30582 1726855368.63802: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855368.3249736-35475-199603228137088/ > /dev/null 2>&1 && sleep 0' 30582 1726855368.64436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855368.64495: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30582 1726855368.64510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855368.64582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855368.64605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855368.64631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855368.64721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855368.66702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855368.66706: stdout chunk (state=3): >>><<< 30582 1726855368.66709: stderr chunk (state=3): >>><<< 30582 1726855368.66711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855368.66713: handler run complete 30582 1726855368.66722: Evaluated conditional (False): False 30582 1726855368.66738: attempt loop complete, returning result 30582 1726855368.66744: _execute() done 30582 1726855368.66750: dumping result to json 30582 1726855368.66758: done dumping result, returning 30582 1726855368.66809: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-00000000200b] 30582 1726855368.66812: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000200b fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.033244", "end": "2024-09-20 14:02:48.619808", "rc": 1, "start": "2024-09-20 14:02:48.586564" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855368.67158: no more pending results, returning what we have 30582 1726855368.67162: results queue empty 30582 1726855368.67163: checking for any_errors_fatal 30582 1726855368.67167: done checking for any_errors_fatal 30582 1726855368.67168: checking for max_fail_percentage 30582 1726855368.67170: done checking for max_fail_percentage 30582 1726855368.67171: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.67172: done checking to see if all hosts have failed 30582 1726855368.67173: getting the remaining hosts for this loop 30582 1726855368.67174: done getting the remaining hosts for this loop 30582 1726855368.67178: getting the next task for host managed_node3 30582 1726855368.67198: done getting next task for host managed_node3 30582 1726855368.67201: ^ task is: TASK: Include the task 'run_test.yml' 30582 1726855368.67204: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855368.67208: getting variables 30582 1726855368.67210: in VariableManager get_vars() 30582 1726855368.67254: Calling all_inventory to load vars for managed_node3 30582 1726855368.67257: Calling groups_inventory to load vars for managed_node3 30582 1726855368.67261: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.67276: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.67280: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.67284: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.67427: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000200b 30582 1726855368.67431: WORKER PROCESS EXITING 30582 1726855368.69170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.70920: done with get_vars() 30582 1726855368.70955: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Friday 20 September 2024 14:02:48 -0400 (0:00:00.433) 0:01:45.060 ****** 30582 1726855368.71075: entering _queue_task() for managed_node3/include_tasks 30582 1726855368.71512: worker is 1 (out of 1 available) 30582 1726855368.71524: exiting _queue_task() for managed_node3/include_tasks 30582 1726855368.71537: done queuing things up, now waiting for results queue to drain 30582 1726855368.71538: waiting for pending results... 
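The FAILED-but-ignored result above (rc=1, "...ignoring") corresponds to a shell task roughly like the following. This is a sketch inferred from the logged `cmd` and the ignore marker; the actual task file for "Cleanup profile and device" is not shown in this log, so the exact YAML is an assumption:

```yaml
# Sketch of the "Cleanup profile and device" task implied by the log.
# The command list is taken verbatim from the logged "cmd"; the
# ignore_errors flag is inferred from the "...ignoring" marker.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true  # rc=1 is expected when the profile/device is already absent
```

Because the test verifies that removing an absent profile is harmless, a non-zero return code from `nmcli`/`ip` here is tolerated rather than failing the play.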
30582 1726855368.71857: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 30582 1726855368.71974: in run() - task 0affcc66-ac2b-aa83-7d57-000000000017 30582 1726855368.71999: variable 'ansible_search_path' from source: unknown 30582 1726855368.72048: calling self._execute() 30582 1726855368.72161: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.72175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.72192: variable 'omit' from source: magic vars 30582 1726855368.72624: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.72642: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.72653: _execute() done 30582 1726855368.72673: dumping result to json 30582 1726855368.72683: done dumping result, returning 30582 1726855368.72696: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcc66-ac2b-aa83-7d57-000000000017] 30582 1726855368.72706: sending task result for task 0affcc66-ac2b-aa83-7d57-000000000017 30582 1726855368.72880: no more pending results, returning what we have 30582 1726855368.72885: in VariableManager get_vars() 30582 1726855368.72947: Calling all_inventory to load vars for managed_node3 30582 1726855368.72951: Calling groups_inventory to load vars for managed_node3 30582 1726855368.72955: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.72972: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.72976: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.72980: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.73808: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000000017 30582 1726855368.73812: WORKER PROCESS EXITING 30582 1726855368.74756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30582 1726855368.76348: done with get_vars() 30582 1726855368.76377: variable 'ansible_search_path' from source: unknown 30582 1726855368.76396: we have included files to process 30582 1726855368.76397: generating all_blocks data 30582 1726855368.76399: done generating all_blocks data 30582 1726855368.76405: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855368.76406: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855368.76409: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30582 1726855368.76814: in VariableManager get_vars() 30582 1726855368.76834: done with get_vars() 30582 1726855368.76877: in VariableManager get_vars() 30582 1726855368.76899: done with get_vars() 30582 1726855368.76939: in VariableManager get_vars() 30582 1726855368.76956: done with get_vars() 30582 1726855368.77000: in VariableManager get_vars() 30582 1726855368.77018: done with get_vars() 30582 1726855368.77057: in VariableManager get_vars() 30582 1726855368.77077: done with get_vars() 30582 1726855368.77602: in VariableManager get_vars() 30582 1726855368.77619: done with get_vars() 30582 1726855368.77631: done processing included file 30582 1726855368.77633: iterating over new_blocks loaded from include file 30582 1726855368.77634: in VariableManager get_vars() 30582 1726855368.77646: done with get_vars() 30582 1726855368.77647: filtering new block on tags 30582 1726855368.77749: done filtering new block on tags 30582 1726855368.77753: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 30582 1726855368.77758: extending task lists for all hosts with included 
blocks 30582 1726855368.77798: done extending task lists 30582 1726855368.77799: done processing included files 30582 1726855368.77800: results queue empty 30582 1726855368.77801: checking for any_errors_fatal 30582 1726855368.77805: done checking for any_errors_fatal 30582 1726855368.77806: checking for max_fail_percentage 30582 1726855368.77807: done checking for max_fail_percentage 30582 1726855368.77808: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.77809: done checking to see if all hosts have failed 30582 1726855368.77809: getting the remaining hosts for this loop 30582 1726855368.77811: done getting the remaining hosts for this loop 30582 1726855368.77813: getting the next task for host managed_node3 30582 1726855368.77817: done getting next task for host managed_node3 30582 1726855368.77819: ^ task is: TASK: TEST: {{ lsr_description }} 30582 1726855368.77821: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855368.77823: getting variables 30582 1726855368.77824: in VariableManager get_vars() 30582 1726855368.77834: Calling all_inventory to load vars for managed_node3 30582 1726855368.77836: Calling groups_inventory to load vars for managed_node3 30582 1726855368.77838: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.77843: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.77845: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.77847: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.78995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.80603: done with get_vars() 30582 1726855368.80632: done getting variables 30582 1726855368.80682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855368.80806: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 14:02:48 -0400 (0:00:00.097) 0:01:45.158 ****** 30582 1726855368.80838: entering _queue_task() for managed_node3/debug 30582 1726855368.81259: worker is 1 (out of 1 available) 30582 1726855368.81275: exiting _queue_task() for managed_node3/debug 30582 1726855368.81289: done queuing things up, now waiting for results queue to drain 30582 1726855368.81492: waiting for pending results... 
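The "TEST: {{ lsr_description }}" banner task queued above (run_test.yml:5) is a templated debug task; a minimal sketch consistent with the task name and the MSG printed below, with the exact formatting an assumption:

```yaml
# Sketch of run_test.yml:5 inferred from the task name and its printed
# MSG ("########## <description> ##########"); not the verbatim source.
- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: |
      ##########
      {{ lsr_description }}
      ##########
```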
30582 1726855368.81586: running TaskExecutor() for managed_node3/TASK: TEST: I will not get an error when I try to remove an absent profile 30582 1726855368.81708: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020ad 30582 1726855368.81734: variable 'ansible_search_path' from source: unknown 30582 1726855368.81742: variable 'ansible_search_path' from source: unknown 30582 1726855368.81786: calling self._execute() 30582 1726855368.81894: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.81905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.81935: variable 'omit' from source: magic vars 30582 1726855368.82302: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.82371: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.82374: variable 'omit' from source: magic vars 30582 1726855368.82377: variable 'omit' from source: magic vars 30582 1726855368.82479: variable 'lsr_description' from source: include params 30582 1726855368.82506: variable 'omit' from source: magic vars 30582 1726855368.82549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855368.82595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.82622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855368.82645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.82663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.82802: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.82806: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855368.82811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.82832: Set connection var ansible_timeout to 10 30582 1726855368.82839: Set connection var ansible_connection to ssh 30582 1726855368.82851: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.82859: Set connection var ansible_pipelining to False 30582 1726855368.82871: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.82877: Set connection var ansible_shell_type to sh 30582 1726855368.82905: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.82915: variable 'ansible_connection' from source: unknown 30582 1726855368.82922: variable 'ansible_module_compression' from source: unknown 30582 1726855368.82928: variable 'ansible_shell_type' from source: unknown 30582 1726855368.82933: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.83021: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.83024: variable 'ansible_pipelining' from source: unknown 30582 1726855368.83026: variable 'ansible_timeout' from source: unknown 30582 1726855368.83028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.83105: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.83125: variable 'omit' from source: magic vars 30582 1726855368.83136: starting attempt loop 30582 1726855368.83143: running the handler 30582 1726855368.83194: handler run complete 30582 1726855368.83214: attempt loop complete, returning result 30582 1726855368.83239: _execute() done 30582 1726855368.83243: dumping result to json 30582 1726855368.83245: done dumping result, 
returning 30582 1726855368.83247: done running TaskExecutor() for managed_node3/TASK: TEST: I will not get an error when I try to remove an absent profile [0affcc66-ac2b-aa83-7d57-0000000020ad] 30582 1726855368.83254: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020ad 30582 1726855368.83610: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020ad 30582 1726855368.83614: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 30582 1726855368.83658: no more pending results, returning what we have 30582 1726855368.83662: results queue empty 30582 1726855368.83663: checking for any_errors_fatal 30582 1726855368.83667: done checking for any_errors_fatal 30582 1726855368.83668: checking for max_fail_percentage 30582 1726855368.83670: done checking for max_fail_percentage 30582 1726855368.83671: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.83671: done checking to see if all hosts have failed 30582 1726855368.83672: getting the remaining hosts for this loop 30582 1726855368.83673: done getting the remaining hosts for this loop 30582 1726855368.83677: getting the next task for host managed_node3 30582 1726855368.83683: done getting next task for host managed_node3 30582 1726855368.83686: ^ task is: TASK: Show item 30582 1726855368.83690: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855368.83694: getting variables 30582 1726855368.83695: in VariableManager get_vars() 30582 1726855368.83735: Calling all_inventory to load vars for managed_node3 30582 1726855368.83738: Calling groups_inventory to load vars for managed_node3 30582 1726855368.83742: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.83752: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.83755: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.83758: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.85444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.87386: done with get_vars() 30582 1726855368.87416: done getting variables 30582 1726855368.87491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 14:02:48 -0400 (0:00:00.066) 0:01:45.225 ****** 30582 1726855368.87516: entering _queue_task() for managed_node3/debug 30582 1726855368.87792: worker is 1 (out of 1 available) 30582 1726855368.87806: exiting _queue_task() for managed_node3/debug 30582 1726855368.87817: done queuing things up, now waiting for results queue to drain 30582 1726855368.87819: waiting for pending results... 
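The "Show item" task (run_test.yml:9) produces one `ok` result per test parameter. A sketch of the task inferred from the loop output that follows (each item prints `ansible_loop_var`, the item name, and the named variable's value); the exact loop list is an assumption based on which items appear:

```yaml
# Sketch of the "Show item" debug loop, reconstructed from its output.
# "var: {{ item }}" resolves each item name to the include-param value,
# which matches the printed results (lsr_description, lsr_setup, ...).
- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
```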
30582 1726855368.88009: running TaskExecutor() for managed_node3/TASK: Show item 30582 1726855368.88074: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020ae 30582 1726855368.88086: variable 'ansible_search_path' from source: unknown 30582 1726855368.88091: variable 'ansible_search_path' from source: unknown 30582 1726855368.88133: variable 'omit' from source: magic vars 30582 1726855368.88251: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.88267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.88271: variable 'omit' from source: magic vars 30582 1726855368.88532: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.88542: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.88548: variable 'omit' from source: magic vars 30582 1726855368.88572: variable 'omit' from source: magic vars 30582 1726855368.88605: variable 'item' from source: unknown 30582 1726855368.88653: variable 'item' from source: unknown 30582 1726855368.88669: variable 'omit' from source: magic vars 30582 1726855368.88703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855368.88728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.88744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855368.88757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.88770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.88791: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.88796: variable 'ansible_host' from source: host vars for 'managed_node3' 
30582 1726855368.88798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.88870: Set connection var ansible_timeout to 10 30582 1726855368.88873: Set connection var ansible_connection to ssh 30582 1726855368.88879: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.88884: Set connection var ansible_pipelining to False 30582 1726855368.88890: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.88892: Set connection var ansible_shell_type to sh 30582 1726855368.88907: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.88914: variable 'ansible_connection' from source: unknown 30582 1726855368.88918: variable 'ansible_module_compression' from source: unknown 30582 1726855368.88920: variable 'ansible_shell_type' from source: unknown 30582 1726855368.88947: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.88949: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.88952: variable 'ansible_pipelining' from source: unknown 30582 1726855368.88954: variable 'ansible_timeout' from source: unknown 30582 1726855368.88956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.89184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.89195: variable 'omit' from source: magic vars 30582 1726855368.89219: starting attempt loop 30582 1726855368.89223: running the handler 30582 1726855368.89303: variable 'lsr_description' from source: include params 30582 1726855368.89495: variable 'lsr_description' from source: include params 30582 1726855368.89498: handler run complete 30582 1726855368.89500: attempt loop 
complete, returning result 30582 1726855368.89502: variable 'item' from source: unknown 30582 1726855368.89504: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 30582 1726855368.89872: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.89879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.89882: variable 'omit' from source: magic vars 30582 1726855368.89940: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.89956: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.89964: variable 'omit' from source: magic vars 30582 1726855368.89982: variable 'omit' from source: magic vars 30582 1726855368.90029: variable 'item' from source: unknown 30582 1726855368.90103: variable 'item' from source: unknown 30582 1726855368.90146: variable 'omit' from source: magic vars 30582 1726855368.90268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.90272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.90274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.90277: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.90279: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.90281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.90371: Set connection var ansible_timeout to 10 30582 1726855368.90376: Set connection var 
ansible_connection to ssh 30582 1726855368.90378: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.90380: Set connection var ansible_pipelining to False 30582 1726855368.90382: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.90384: Set connection var ansible_shell_type to sh 30582 1726855368.90386: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.90389: variable 'ansible_connection' from source: unknown 30582 1726855368.90392: variable 'ansible_module_compression' from source: unknown 30582 1726855368.90393: variable 'ansible_shell_type' from source: unknown 30582 1726855368.90395: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.90397: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.90398: variable 'ansible_pipelining' from source: unknown 30582 1726855368.90404: variable 'ansible_timeout' from source: unknown 30582 1726855368.90407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.90580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.90583: variable 'omit' from source: magic vars 30582 1726855368.90585: starting attempt loop 30582 1726855368.90589: running the handler 30582 1726855368.90591: variable 'lsr_setup' from source: include params 30582 1726855368.90647: variable 'lsr_setup' from source: include params 30582 1726855368.90794: handler run complete 30582 1726855368.90798: attempt loop complete, returning result 30582 1726855368.90805: variable 'item' from source: unknown 30582 1726855368.90893: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", 
"item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 30582 1726855368.91298: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.91302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.91304: variable 'omit' from source: magic vars 30582 1726855368.91306: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.91308: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.91311: variable 'omit' from source: magic vars 30582 1726855368.91312: variable 'omit' from source: magic vars 30582 1726855368.91317: variable 'item' from source: unknown 30582 1726855368.91380: variable 'item' from source: unknown 30582 1726855368.91410: variable 'omit' from source: magic vars 30582 1726855368.91432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.91443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.91453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.91471: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.91479: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.91486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.91573: Set connection var ansible_timeout to 10 30582 1726855368.91580: Set connection var ansible_connection to ssh 30582 1726855368.91595: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.91605: Set connection var ansible_pipelining to False 30582 1726855368.91623: Set 
connection var ansible_shell_executable to /bin/sh 30582 1726855368.91633: Set connection var ansible_shell_type to sh 30582 1726855368.91653: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.91661: variable 'ansible_connection' from source: unknown 30582 1726855368.91670: variable 'ansible_module_compression' from source: unknown 30582 1726855368.91678: variable 'ansible_shell_type' from source: unknown 30582 1726855368.91685: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.91695: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.91704: variable 'ansible_pipelining' from source: unknown 30582 1726855368.91711: variable 'ansible_timeout' from source: unknown 30582 1726855368.91718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.91842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.91894: variable 'omit' from source: magic vars 30582 1726855368.91897: starting attempt loop 30582 1726855368.91899: running the handler 30582 1726855368.91901: variable 'lsr_test' from source: include params 30582 1726855368.91976: variable 'lsr_test' from source: include params 30582 1726855368.92004: handler run complete 30582 1726855368.92023: attempt loop complete, returning result 30582 1726855368.92042: variable 'item' from source: unknown 30582 1726855368.92106: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30582 1726855368.92189: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.92193: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 30582 1726855368.92202: variable 'omit' from source: magic vars 30582 1726855368.92318: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.92321: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.92329: variable 'omit' from source: magic vars 30582 1726855368.92338: variable 'omit' from source: magic vars 30582 1726855368.92364: variable 'item' from source: unknown 30582 1726855368.92411: variable 'item' from source: unknown 30582 1726855368.92422: variable 'omit' from source: magic vars 30582 1726855368.92437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.92442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.92448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.92456: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.92459: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.92461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.92513: Set connection var ansible_timeout to 10 30582 1726855368.92516: Set connection var ansible_connection to ssh 30582 1726855368.92522: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.92524: Set connection var ansible_pipelining to False 30582 1726855368.92529: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.92532: Set connection var ansible_shell_type to sh 30582 1726855368.92545: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.92547: variable 'ansible_connection' from source: unknown 30582 
1726855368.92549: variable 'ansible_module_compression' from source: unknown 30582 1726855368.92552: variable 'ansible_shell_type' from source: unknown 30582 1726855368.92554: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.92558: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.92562: variable 'ansible_pipelining' from source: unknown 30582 1726855368.92564: variable 'ansible_timeout' from source: unknown 30582 1726855368.92571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.92630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.92638: variable 'omit' from source: magic vars 30582 1726855368.92642: starting attempt loop 30582 1726855368.92644: running the handler 30582 1726855368.92660: variable 'lsr_assert' from source: include params 30582 1726855368.92708: variable 'lsr_assert' from source: include params 30582 1726855368.92723: handler run complete 30582 1726855368.92732: attempt loop complete, returning result 30582 1726855368.92744: variable 'item' from source: unknown 30582 1726855368.92789: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 30582 1726855368.92872: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.92875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.92877: variable 'omit' from source: magic vars 30582 1726855368.93015: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.93023: Evaluated 
conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.93026: variable 'omit' from source: magic vars 30582 1726855368.93035: variable 'omit' from source: magic vars 30582 1726855368.93061: variable 'item' from source: unknown 30582 1726855368.93108: variable 'item' from source: unknown 30582 1726855368.93118: variable 'omit' from source: magic vars 30582 1726855368.93133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.93139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.93144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.93153: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.93155: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.93157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.93207: Set connection var ansible_timeout to 10 30582 1726855368.93210: Set connection var ansible_connection to ssh 30582 1726855368.93213: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.93223: Set connection var ansible_pipelining to False 30582 1726855368.93225: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.93228: Set connection var ansible_shell_type to sh 30582 1726855368.93241: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.93243: variable 'ansible_connection' from source: unknown 30582 1726855368.93245: variable 'ansible_module_compression' from source: unknown 30582 1726855368.93248: variable 'ansible_shell_type' from source: unknown 30582 1726855368.93250: variable 'ansible_shell_executable' from source: unknown 30582 
1726855368.93252: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.93256: variable 'ansible_pipelining' from source: unknown 30582 1726855368.93258: variable 'ansible_timeout' from source: unknown 30582 1726855368.93262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.93328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.93332: variable 'omit' from source: magic vars 30582 1726855368.93334: starting attempt loop 30582 1726855368.93337: running the handler 30582 1726855368.93350: variable 'lsr_assert_when' from source: include params 30582 1726855368.93398: variable 'lsr_assert_when' from source: include params 30582 1726855368.93460: variable 'network_provider' from source: set_fact 30582 1726855368.93488: handler run complete 30582 1726855368.93499: attempt loop complete, returning result 30582 1726855368.93509: variable 'item' from source: unknown 30582 1726855368.93576: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30582 1726855368.93650: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.93653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.93655: variable 'omit' from source: magic vars 30582 1726855368.94142: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.94146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.94149: variable 'omit' from source: magic vars 30582 1726855368.94151: 
variable 'omit' from source: magic vars 30582 1726855368.94153: variable 'item' from source: unknown 30582 1726855368.94156: variable 'item' from source: unknown 30582 1726855368.94158: variable 'omit' from source: magic vars 30582 1726855368.94160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.94162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.94164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.94166: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.94168: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.94170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.94171: Set connection var ansible_timeout to 10 30582 1726855368.94173: Set connection var ansible_connection to ssh 30582 1726855368.94175: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.94176: Set connection var ansible_pipelining to False 30582 1726855368.94178: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.94180: Set connection var ansible_shell_type to sh 30582 1726855368.94182: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.94184: variable 'ansible_connection' from source: unknown 30582 1726855368.94185: variable 'ansible_module_compression' from source: unknown 30582 1726855368.94189: variable 'ansible_shell_type' from source: unknown 30582 1726855368.94192: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.94194: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.94195: variable 'ansible_pipelining' from source: 
unknown 30582 1726855368.94197: variable 'ansible_timeout' from source: unknown 30582 1726855368.94199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.94342: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.94345: variable 'omit' from source: magic vars 30582 1726855368.94348: starting attempt loop 30582 1726855368.94350: running the handler 30582 1726855368.94352: variable 'lsr_fail_debug' from source: play vars 30582 1726855368.94354: variable 'lsr_fail_debug' from source: play vars 30582 1726855368.94356: handler run complete 30582 1726855368.94451: attempt loop complete, returning result 30582 1726855368.94454: variable 'item' from source: unknown 30582 1726855368.94456: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30582 1726855368.94524: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.94528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.94530: variable 'omit' from source: magic vars 30582 1726855368.94997: variable 'ansible_distribution_major_version' from source: facts 30582 1726855368.95000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855368.95002: variable 'omit' from source: magic vars 30582 1726855368.95004: variable 'omit' from source: magic vars 30582 1726855368.95006: variable 'item' from source: unknown 30582 1726855368.95008: variable 'item' from source: unknown 30582 1726855368.95010: variable 'omit' from source: magic vars 30582 1726855368.95012: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855368.95019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.95021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855368.95023: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855368.95025: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.95027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.95029: Set connection var ansible_timeout to 10 30582 1726855368.95031: Set connection var ansible_connection to ssh 30582 1726855368.95033: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855368.95035: Set connection var ansible_pipelining to False 30582 1726855368.95037: Set connection var ansible_shell_executable to /bin/sh 30582 1726855368.95039: Set connection var ansible_shell_type to sh 30582 1726855368.95041: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.95042: variable 'ansible_connection' from source: unknown 30582 1726855368.95044: variable 'ansible_module_compression' from source: unknown 30582 1726855368.95046: variable 'ansible_shell_type' from source: unknown 30582 1726855368.95048: variable 'ansible_shell_executable' from source: unknown 30582 1726855368.95050: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855368.95054: variable 'ansible_pipelining' from source: unknown 30582 1726855368.95056: variable 'ansible_timeout' from source: unknown 30582 1726855368.95058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855368.95249: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855368.95261: variable 'omit' from source: magic vars 30582 1726855368.95269: starting attempt loop 30582 1726855368.95276: running the handler 30582 1726855368.95494: variable 'lsr_cleanup' from source: include params 30582 1726855368.95497: variable 'lsr_cleanup' from source: include params 30582 1726855368.95507: handler run complete 30582 1726855368.95524: attempt loop complete, returning result 30582 1726855368.95614: variable 'item' from source: unknown 30582 1726855368.95737: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 30582 1726855368.95915: dumping result to json 30582 1726855368.95918: done dumping result, returning 30582 1726855368.95920: done running TaskExecutor() for managed_node3/TASK: Show item [0affcc66-ac2b-aa83-7d57-0000000020ae] 30582 1726855368.96040: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020ae 30582 1726855368.96082: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020ae 30582 1726855368.96084: WORKER PROCESS EXITING 30582 1726855368.96196: no more pending results, returning what we have 30582 1726855368.96200: results queue empty 30582 1726855368.96201: checking for any_errors_fatal 30582 1726855368.96206: done checking for any_errors_fatal 30582 1726855368.96207: checking for max_fail_percentage 30582 1726855368.96208: done checking for max_fail_percentage 30582 1726855368.96209: checking to see if all hosts have failed and the running result is not ok 30582 1726855368.96210: done checking to see if all hosts have failed 30582 1726855368.96211: getting the 
remaining hosts for this loop 30582 1726855368.96212: done getting the remaining hosts for this loop 30582 1726855368.96215: getting the next task for host managed_node3 30582 1726855368.96221: done getting next task for host managed_node3 30582 1726855368.96223: ^ task is: TASK: Include the task 'show_interfaces.yml' 30582 1726855368.96226: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855368.96229: getting variables 30582 1726855368.96231: in VariableManager get_vars() 30582 1726855368.96266: Calling all_inventory to load vars for managed_node3 30582 1726855368.96268: Calling groups_inventory to load vars for managed_node3 30582 1726855368.96271: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855368.96280: Calling all_plugins_play to load vars for managed_node3 30582 1726855368.96283: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855368.96285: Calling groups_plugins_play to load vars for managed_node3 30582 1726855368.97646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855368.99256: done with get_vars() 30582 1726855368.99297: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 14:02:48 -0400 (0:00:00.118) 
0:01:45.343 ****** 30582 1726855368.99401: entering _queue_task() for managed_node3/include_tasks 30582 1726855368.99867: worker is 1 (out of 1 available) 30582 1726855368.99879: exiting _queue_task() for managed_node3/include_tasks 30582 1726855368.99892: done queuing things up, now waiting for results queue to drain 30582 1726855368.99893: waiting for pending results... 30582 1726855369.00282: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 30582 1726855369.00290: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020af 30582 1726855369.00294: variable 'ansible_search_path' from source: unknown 30582 1726855369.00297: variable 'ansible_search_path' from source: unknown 30582 1726855369.00326: calling self._execute() 30582 1726855369.00433: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.00446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.00460: variable 'omit' from source: magic vars 30582 1726855369.00850: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.00869: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.00880: _execute() done 30582 1726855369.00889: dumping result to json 30582 1726855369.00897: done dumping result, returning 30582 1726855369.00908: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcc66-ac2b-aa83-7d57-0000000020af] 30582 1726855369.00926: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020af 30582 1726855369.01125: no more pending results, returning what we have 30582 1726855369.01131: in VariableManager get_vars() 30582 1726855369.01186: Calling all_inventory to load vars for managed_node3 30582 1726855369.01191: Calling groups_inventory to load vars for managed_node3 30582 1726855369.01195: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.01405: Calling 
all_plugins_play to load vars for managed_node3 30582 1726855369.01408: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.01411: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.02124: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020af 30582 1726855369.02127: WORKER PROCESS EXITING 30582 1726855369.03061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.04604: done with get_vars() 30582 1726855369.04640: variable 'ansible_search_path' from source: unknown 30582 1726855369.04642: variable 'ansible_search_path' from source: unknown 30582 1726855369.04685: we have included files to process 30582 1726855369.04686: generating all_blocks data 30582 1726855369.04691: done generating all_blocks data 30582 1726855369.04696: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855369.04697: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855369.04699: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30582 1726855369.04817: in VariableManager get_vars() 30582 1726855369.04841: done with get_vars() 30582 1726855369.04966: done processing included file 30582 1726855369.04969: iterating over new_blocks loaded from include file 30582 1726855369.04970: in VariableManager get_vars() 30582 1726855369.04986: done with get_vars() 30582 1726855369.04990: filtering new block on tags 30582 1726855369.05026: done filtering new block on tags 30582 1726855369.05029: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for 
managed_node3 30582 1726855369.05034: extending task lists for all hosts with included blocks 30582 1726855369.05500: done extending task lists 30582 1726855369.05502: done processing included files 30582 1726855369.05502: results queue empty 30582 1726855369.05503: checking for any_errors_fatal 30582 1726855369.05509: done checking for any_errors_fatal 30582 1726855369.05510: checking for max_fail_percentage 30582 1726855369.05511: done checking for max_fail_percentage 30582 1726855369.05512: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.05513: done checking to see if all hosts have failed 30582 1726855369.05513: getting the remaining hosts for this loop 30582 1726855369.05514: done getting the remaining hosts for this loop 30582 1726855369.05517: getting the next task for host managed_node3 30582 1726855369.05525: done getting next task for host managed_node3 30582 1726855369.05526: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30582 1726855369.05530: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855369.05532: getting variables 30582 1726855369.05533: in VariableManager get_vars() 30582 1726855369.05541: Calling all_inventory to load vars for managed_node3 30582 1726855369.05543: Calling groups_inventory to load vars for managed_node3 30582 1726855369.05544: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.05548: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.05550: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.05551: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.06265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.11900: done with get_vars() 30582 1726855369.11922: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 14:02:49 -0400 (0:00:00.125) 0:01:45.469 ****** 30582 1726855369.11978: entering _queue_task() for managed_node3/include_tasks 30582 1726855369.12260: worker is 1 (out of 1 available) 30582 1726855369.12274: exiting _queue_task() for managed_node3/include_tasks 30582 1726855369.12290: done queuing things up, now waiting for results queue to drain 30582 1726855369.12292: waiting for pending results... 
30582 1726855369.12478: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 30582 1726855369.12573: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020d6 30582 1726855369.12585: variable 'ansible_search_path' from source: unknown 30582 1726855369.12590: variable 'ansible_search_path' from source: unknown 30582 1726855369.12622: calling self._execute() 30582 1726855369.12702: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.12706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.12715: variable 'omit' from source: magic vars 30582 1726855369.13010: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.13020: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.13026: _execute() done 30582 1726855369.13029: dumping result to json 30582 1726855369.13033: done dumping result, returning 30582 1726855369.13040: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcc66-ac2b-aa83-7d57-0000000020d6] 30582 1726855369.13045: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020d6 30582 1726855369.13145: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020d6 30582 1726855369.13148: WORKER PROCESS EXITING 30582 1726855369.13200: no more pending results, returning what we have 30582 1726855369.13206: in VariableManager get_vars() 30582 1726855369.13252: Calling all_inventory to load vars for managed_node3 30582 1726855369.13255: Calling groups_inventory to load vars for managed_node3 30582 1726855369.13258: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.13272: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.13274: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.13279: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855369.14123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.14996: done with get_vars() 30582 1726855369.15016: variable 'ansible_search_path' from source: unknown 30582 1726855369.15017: variable 'ansible_search_path' from source: unknown 30582 1726855369.15047: we have included files to process 30582 1726855369.15048: generating all_blocks data 30582 1726855369.15049: done generating all_blocks data 30582 1726855369.15051: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855369.15051: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855369.15054: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30582 1726855369.15260: done processing included file 30582 1726855369.15263: iterating over new_blocks loaded from include file 30582 1726855369.15265: in VariableManager get_vars() 30582 1726855369.15282: done with get_vars() 30582 1726855369.15284: filtering new block on tags 30582 1726855369.15328: done filtering new block on tags 30582 1726855369.15330: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 30582 1726855369.15335: extending task lists for all hosts with included blocks 30582 1726855369.15484: done extending task lists 30582 1726855369.15485: done processing included files 30582 1726855369.15486: results queue empty 30582 1726855369.15486: checking for any_errors_fatal 30582 1726855369.15491: done checking for any_errors_fatal 30582 1726855369.15491: checking for max_fail_percentage 30582 1726855369.15493: done 
checking for max_fail_percentage 30582 1726855369.15493: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.15494: done checking to see if all hosts have failed 30582 1726855369.15495: getting the remaining hosts for this loop 30582 1726855369.15496: done getting the remaining hosts for this loop 30582 1726855369.15498: getting the next task for host managed_node3 30582 1726855369.15503: done getting next task for host managed_node3 30582 1726855369.15505: ^ task is: TASK: Gather current interface info 30582 1726855369.15508: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855369.15511: getting variables 30582 1726855369.15512: in VariableManager get_vars() 30582 1726855369.15523: Calling all_inventory to load vars for managed_node3 30582 1726855369.15525: Calling groups_inventory to load vars for managed_node3 30582 1726855369.15527: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.15533: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.15535: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.15537: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.16658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.17778: done with get_vars() 30582 1726855369.17799: done getting variables 30582 1726855369.17830: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 14:02:49 -0400 (0:00:00.058) 0:01:45.528 ****** 30582 1726855369.17855: entering _queue_task() for managed_node3/command 30582 1726855369.18131: worker is 1 (out of 1 available) 30582 1726855369.18147: exiting _queue_task() for managed_node3/command 30582 1726855369.18160: done queuing things up, now waiting for results queue to drain 30582 1726855369.18162: waiting for pending results... 
30582 1726855369.18353: running TaskExecutor() for managed_node3/TASK: Gather current interface info 30582 1726855369.18444: in run() - task 0affcc66-ac2b-aa83-7d57-000000002111 30582 1726855369.18457: variable 'ansible_search_path' from source: unknown 30582 1726855369.18461: variable 'ansible_search_path' from source: unknown 30582 1726855369.18497: calling self._execute() 30582 1726855369.18574: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.18577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.18585: variable 'omit' from source: magic vars 30582 1726855369.18881: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.18893: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.18900: variable 'omit' from source: magic vars 30582 1726855369.18939: variable 'omit' from source: magic vars 30582 1726855369.18964: variable 'omit' from source: magic vars 30582 1726855369.19000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855369.19026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855369.19044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855369.19058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.19073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.19098: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855369.19101: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.19104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855369.19179: Set connection var ansible_timeout to 10 30582 1726855369.19182: Set connection var ansible_connection to ssh 30582 1726855369.19191: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855369.19193: Set connection var ansible_pipelining to False 30582 1726855369.19199: Set connection var ansible_shell_executable to /bin/sh 30582 1726855369.19202: Set connection var ansible_shell_type to sh 30582 1726855369.19217: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.19220: variable 'ansible_connection' from source: unknown 30582 1726855369.19224: variable 'ansible_module_compression' from source: unknown 30582 1726855369.19226: variable 'ansible_shell_type' from source: unknown 30582 1726855369.19229: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.19231: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.19233: variable 'ansible_pipelining' from source: unknown 30582 1726855369.19236: variable 'ansible_timeout' from source: unknown 30582 1726855369.19238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.19341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855369.19349: variable 'omit' from source: magic vars 30582 1726855369.19356: starting attempt loop 30582 1726855369.19360: running the handler 30582 1726855369.19375: _low_level_execute_command(): starting 30582 1726855369.19382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855369.19892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30582 1726855369.19897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855369.19900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.19942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855369.19956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.20037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.21748: stdout chunk (state=3): >>>/root <<< 30582 1726855369.21855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855369.21889: stderr chunk (state=3): >>><<< 30582 1726855369.21893: stdout chunk (state=3): >>><<< 30582 1726855369.21915: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855369.21928: _low_level_execute_command(): starting 30582 1726855369.21934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293 `" && echo ansible-tmp-1726855369.2191522-35520-167038562267293="` echo /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293 `" ) && sleep 0' 30582 1726855369.22392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855369.22396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855369.22399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.22401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855369.22411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.22456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855369.22460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855369.22462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.22524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.24428: stdout chunk (state=3): >>>ansible-tmp-1726855369.2191522-35520-167038562267293=/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293 <<< 30582 1726855369.24530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855369.24554: stderr chunk (state=3): >>><<< 30582 1726855369.24558: stdout chunk (state=3): >>><<< 30582 1726855369.24579: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855369.2191522-35520-167038562267293=/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855369.24608: variable 'ansible_module_compression' from source: unknown 30582 1726855369.24653: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855369.24690: variable 'ansible_facts' from source: unknown 30582 1726855369.24744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py 30582 1726855369.24850: Sending initial data 30582 1726855369.24854: Sent initial data (156 bytes) 30582 1726855369.25304: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855369.25308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855369.25310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.25313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855369.25315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.25368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855369.25371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855369.25373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.25437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.27001: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855369.27055: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855369.27118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3wfrfrwo /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py <<< 30582 1726855369.27121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py" <<< 30582 1726855369.27178: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3wfrfrwo" to remote "/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py" <<< 30582 1726855369.28109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855369.28113: stdout chunk (state=3): >>><<< 30582 1726855369.28116: stderr chunk (state=3): >>><<< 30582 1726855369.28118: done transferring module to remote 30582 1726855369.28120: _low_level_execute_command(): starting 30582 1726855369.28122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/ /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py && sleep 0' 30582 1726855369.28683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855369.28700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855369.28713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855369.28735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855369.28751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 
30582 1726855369.28763: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855369.28852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.28890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855369.28907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855369.28927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.29022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.30834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855369.30848: stdout chunk (state=3): >>><<< 30582 1726855369.30866: stderr chunk (state=3): >>><<< 30582 1726855369.30962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855369.30968: _low_level_execute_command(): starting 30582 1726855369.30971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/AnsiballZ_command.py && sleep 0' 30582 1726855369.31607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.31640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855369.31657: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855369.31682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.31785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.47116: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:02:49.467030", "end": "2024-09-20 14:02:49.470305", "delta": "0:00:00.003275", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855369.48678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855369.48685: stdout chunk (state=3): >>><<< 30582 1726855369.48699: stderr chunk (state=3): >>><<< 30582 1726855369.48715: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 14:02:49.467030", "end": "2024-09-20 14:02:49.470305", "delta": "0:00:00.003275", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855369.48744: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855369.48751: _low_level_execute_command(): starting 30582 1726855369.48756: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855369.2191522-35520-167038562267293/ > /dev/null 2>&1 && sleep 0' 30582 1726855369.49208: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855369.49211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.49214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855369.49216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855369.49267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855369.49271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855369.49343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855369.51165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855369.51193: stderr chunk (state=3): >>><<< 30582 1726855369.51196: stdout chunk (state=3): >>><<< 30582 1726855369.51211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855369.51216: handler run complete 30582 1726855369.51235: Evaluated conditional (False): False 30582 1726855369.51244: attempt loop complete, returning result 30582 1726855369.51247: _execute() done 30582 1726855369.51249: dumping result to json 30582 1726855369.51254: done dumping result, returning 30582 1726855369.51262: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcc66-ac2b-aa83-7d57-000000002111] 30582 1726855369.51269: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002111 30582 1726855369.51369: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002111 30582 1726855369.51372: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003275",
    "end": "2024-09-20 14:02:49.470305",
    "rc": 0,
    "start": "2024-09-20 14:02:49.467030"
}

STDOUT:

bonding_masters
eth0
lo
rpltstbr

30582 1726855369.51454: no more pending results, returning what we have 30582 1726855369.51458: results queue empty 30582 1726855369.51459: checking for any_errors_fatal 30582
1726855369.51461: done checking for any_errors_fatal 30582 1726855369.51461: checking for max_fail_percentage 30582 1726855369.51463: done checking for max_fail_percentage 30582 1726855369.51464: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.51465: done checking to see if all hosts have failed 30582 1726855369.51466: getting the remaining hosts for this loop 30582 1726855369.51468: done getting the remaining hosts for this loop 30582 1726855369.51471: getting the next task for host managed_node3 30582 1726855369.51479: done getting next task for host managed_node3 30582 1726855369.51481: ^ task is: TASK: Set current_interfaces 30582 1726855369.51493: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855369.51499: getting variables 30582 1726855369.51500: in VariableManager get_vars() 30582 1726855369.51542: Calling all_inventory to load vars for managed_node3 30582 1726855369.51545: Calling groups_inventory to load vars for managed_node3 30582 1726855369.51548: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.51558: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.51561: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.51563: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.52399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.53742: done with get_vars() 30582 1726855369.53770: done getting variables 30582 1726855369.53833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024 14:02:49 -0400 (0:00:00.360) 0:01:45.888 ******

30582 1726855369.53870: entering _queue_task() for managed_node3/set_fact 30582 1726855369.54236: worker is 1 (out of 1 available) 30582 1726855369.54250: exiting _queue_task() for managed_node3/set_fact 30582 1726855369.54264: done queuing things up, now waiting for results queue to drain 30582 1726855369.54266: waiting for pending results...
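The `Gather current interface info` task traced above just runs `ls -1` with `chdir: /sys/class/net` through the `command` module and captures `bonding_masters`, `eth0`, `lo`, and `rpltstbr`. A minimal Python sketch of the same gather (the `gather_interfaces` helper and its `sysfs_dir` parameter are illustrative, not part of the test playbook):

```python
import os

def gather_interfaces(sysfs_dir="/sys/class/net"):
    """List the entries of the sysfs network class directory in sorted
    order, mirroring the output of `ls -1` run with chdir: /sys/class/net."""
    return sorted(os.listdir(sysfs_dir))
```

Every entry under `/sys/class/net` is either a network device or a control file such as `bonding_masters`, which is why that name appears alongside real interfaces in the task's STDOUT.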
30582 1726855369.54708: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 30582 1726855369.54713: in run() - task 0affcc66-ac2b-aa83-7d57-000000002112 30582 1726855369.54727: variable 'ansible_search_path' from source: unknown 30582 1726855369.54734: variable 'ansible_search_path' from source: unknown 30582 1726855369.54774: calling self._execute() 30582 1726855369.54882: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.54896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.54910: variable 'omit' from source: magic vars 30582 1726855369.55311: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.55328: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.55341: variable 'omit' from source: magic vars 30582 1726855369.55404: variable 'omit' from source: magic vars 30582 1726855369.55519: variable '_current_interfaces' from source: set_fact 30582 1726855369.55594: variable 'omit' from source: magic vars 30582 1726855369.55639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855369.55701: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855369.55714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855369.55737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.55755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.55808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855369.55812: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.55814: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.55917: Set connection var ansible_timeout to 10 30582 1726855369.56026: Set connection var ansible_connection to ssh 30582 1726855369.56029: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855369.56031: Set connection var ansible_pipelining to False 30582 1726855369.56033: Set connection var ansible_shell_executable to /bin/sh 30582 1726855369.56035: Set connection var ansible_shell_type to sh 30582 1726855369.56037: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.56039: variable 'ansible_connection' from source: unknown 30582 1726855369.56041: variable 'ansible_module_compression' from source: unknown 30582 1726855369.56043: variable 'ansible_shell_type' from source: unknown 30582 1726855369.56045: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.56047: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.56048: variable 'ansible_pipelining' from source: unknown 30582 1726855369.56050: variable 'ansible_timeout' from source: unknown 30582 1726855369.56052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.56155: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855369.56174: variable 'omit' from source: magic vars 30582 1726855369.56185: starting attempt loop 30582 1726855369.56194: running the handler 30582 1726855369.56210: handler run complete 30582 1726855369.56225: attempt loop complete, returning result 30582 1726855369.56233: _execute() done 30582 1726855369.56244: dumping result to json 30582 1726855369.56255: done dumping result, returning 30582 
1726855369.56269: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcc66-ac2b-aa83-7d57-000000002112] 30582 1726855369.56280: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002112 30582 1726855369.56626: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002112 30582 1726855369.56630: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo",
            "rpltstbr"
        ]
    },
    "changed": false
}
30582 1726855369.56689: no more pending results, returning what we have 30582 1726855369.56692: results queue empty 30582 1726855369.56693: checking for any_errors_fatal 30582 1726855369.56700: done checking for any_errors_fatal 30582 1726855369.56701: checking for max_fail_percentage 30582 1726855369.56703: done checking for max_fail_percentage 30582 1726855369.56704: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.56705: done checking to see if all hosts have failed 30582 1726855369.56705: getting the remaining hosts for this loop 30582 1726855369.56707: done getting the remaining hosts for this loop 30582 1726855369.56711: getting the next task for host managed_node3 30582 1726855369.56720: done getting next task for host managed_node3 30582 1726855369.56723: ^ task is: TASK: Show current_interfaces 30582 1726855369.56727: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855369.56732: getting variables 30582 1726855369.56733: in VariableManager get_vars() 30582 1726855369.56774: Calling all_inventory to load vars for managed_node3 30582 1726855369.56778: Calling groups_inventory to load vars for managed_node3 30582 1726855369.56781: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.56795: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.56799: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.56802: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.58430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.59989: done with get_vars() 30582 1726855369.60020: done getting variables 30582 1726855369.60083: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 14:02:49 -0400 (0:00:00.062) 0:01:45.951 ****** 30582 1726855369.60121: entering _queue_task() for managed_node3/debug 30582 1726855369.60500: worker is 1 (out of 1 available) 30582 1726855369.60515: exiting _queue_task() for managed_node3/debug 30582 1726855369.60527: done queuing things up, now waiting for results queue to drain 30582 1726855369.60528: waiting for 
pending results... 30582 1726855369.60817: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 30582 1726855369.60945: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020d7 30582 1726855369.60965: variable 'ansible_search_path' from source: unknown 30582 1726855369.60971: variable 'ansible_search_path' from source: unknown 30582 1726855369.61014: calling self._execute() 30582 1726855369.61116: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.61127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.61142: variable 'omit' from source: magic vars 30582 1726855369.61521: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.61538: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.61549: variable 'omit' from source: magic vars 30582 1726855369.61603: variable 'omit' from source: magic vars 30582 1726855369.61701: variable 'current_interfaces' from source: set_fact 30582 1726855369.61734: variable 'omit' from source: magic vars 30582 1726855369.61801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855369.61821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855369.61846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855369.61869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.61909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855369.61925: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855369.61933: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.61941: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.62051: Set connection var ansible_timeout to 10 30582 1726855369.62092: Set connection var ansible_connection to ssh 30582 1726855369.62095: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855369.62097: Set connection var ansible_pipelining to False 30582 1726855369.62099: Set connection var ansible_shell_executable to /bin/sh 30582 1726855369.62101: Set connection var ansible_shell_type to sh 30582 1726855369.62112: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.62124: variable 'ansible_connection' from source: unknown 30582 1726855369.62132: variable 'ansible_module_compression' from source: unknown 30582 1726855369.62138: variable 'ansible_shell_type' from source: unknown 30582 1726855369.62233: variable 'ansible_shell_executable' from source: unknown 30582 1726855369.62237: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.62239: variable 'ansible_pipelining' from source: unknown 30582 1726855369.62241: variable 'ansible_timeout' from source: unknown 30582 1726855369.62243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.62313: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855369.62331: variable 'omit' from source: magic vars 30582 1726855369.62345: starting attempt loop 30582 1726855369.62352: running the handler 30582 1726855369.62405: handler run complete 30582 1726855369.62423: attempt loop complete, returning result 30582 1726855369.62430: _execute() done 30582 1726855369.62436: dumping result to json 30582 1726855369.62445: done dumping result, returning 30582 
1726855369.62458: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcc66-ac2b-aa83-7d57-0000000020d7] 30582 1726855369.62492: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020d7 30582 1726855369.62803: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020d7 30582 1726855369.62806: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30582 1726855369.62852: no more pending results, returning what we have 30582 1726855369.62856: results queue empty 30582 1726855369.62857: checking for any_errors_fatal 30582 1726855369.62862: done checking for any_errors_fatal 30582 1726855369.62863: checking for max_fail_percentage 30582 1726855369.62864: done checking for max_fail_percentage 30582 1726855369.62865: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.62866: done checking to see if all hosts have failed 30582 1726855369.62866: getting the remaining hosts for this loop 30582 1726855369.62868: done getting the remaining hosts for this loop 30582 1726855369.62872: getting the next task for host managed_node3 30582 1726855369.62880: done getting next task for host managed_node3 30582 1726855369.62883: ^ task is: TASK: Setup 30582 1726855369.62885: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855369.62891: getting variables 30582 1726855369.62893: in VariableManager get_vars() 30582 1726855369.62931: Calling all_inventory to load vars for managed_node3 30582 1726855369.62934: Calling groups_inventory to load vars for managed_node3 30582 1726855369.62937: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.62948: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.62951: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.62953: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.64530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.66875: done with get_vars() 30582 1726855369.66912: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 14:02:49 -0400 (0:00:00.070) 0:01:46.022 ****** 30582 1726855369.67211: entering _queue_task() for managed_node3/include_tasks 30582 1726855369.67934: worker is 1 (out of 1 available) 30582 1726855369.67945: exiting _queue_task() for managed_node3/include_tasks 30582 1726855369.67955: done queuing things up, now waiting for results queue to drain 30582 1726855369.67956: waiting for pending results... 
30582 1726855369.68390: running TaskExecutor() for managed_node3/TASK: Setup 30582 1726855369.68628: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b0 30582 1726855369.68654: variable 'ansible_search_path' from source: unknown 30582 1726855369.68664: variable 'ansible_search_path' from source: unknown 30582 1726855369.68872: variable 'lsr_setup' from source: include params 30582 1726855369.69075: variable 'lsr_setup' from source: include params 30582 1726855369.69169: variable 'omit' from source: magic vars 30582 1726855369.69330: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.69346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.69361: variable 'omit' from source: magic vars 30582 1726855369.69760: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.69775: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.69785: variable 'item' from source: unknown 30582 1726855369.69877: variable 'item' from source: unknown 30582 1726855369.70071: variable 'item' from source: unknown 30582 1726855369.70074: variable 'item' from source: unknown 30582 1726855369.70316: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.70319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.70322: variable 'omit' from source: magic vars 30582 1726855369.70403: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.70415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.70430: variable 'item' from source: unknown 30582 1726855369.70533: variable 'item' from source: unknown 30582 1726855369.70536: variable 'item' from source: unknown 30582 1726855369.70593: variable 'item' from source: unknown 30582 1726855369.70824: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855369.70827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.70830: variable 'omit' from source: magic vars 30582 1726855369.70897: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.70908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.70918: variable 'item' from source: unknown 30582 1726855369.70985: variable 'item' from source: unknown 30582 1726855369.71023: variable 'item' from source: unknown 30582 1726855369.71091: variable 'item' from source: unknown 30582 1726855369.71271: dumping result to json 30582 1726855369.71274: done dumping result, returning 30582 1726855369.71277: done running TaskExecutor() for managed_node3/TASK: Setup [0affcc66-ac2b-aa83-7d57-0000000020b0] 30582 1726855369.71280: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b0 30582 1726855369.71322: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b0 30582 1726855369.71326: WORKER PROCESS EXITING 30582 1726855369.71415: no more pending results, returning what we have 30582 1726855369.71423: in VariableManager get_vars() 30582 1726855369.71476: Calling all_inventory to load vars for managed_node3 30582 1726855369.71480: Calling groups_inventory to load vars for managed_node3 30582 1726855369.71484: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.71501: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.71506: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.71509: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.74611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.76530: done with get_vars() 30582 1726855369.76558: variable 'ansible_search_path' from source: unknown 30582 1726855369.76560: variable 'ansible_search_path' from source: unknown 30582 
1726855369.76606: variable 'ansible_search_path' from source: unknown 30582 1726855369.76607: variable 'ansible_search_path' from source: unknown 30582 1726855369.76641: variable 'ansible_search_path' from source: unknown 30582 1726855369.76643: variable 'ansible_search_path' from source: unknown 30582 1726855369.76673: we have included files to process 30582 1726855369.76674: generating all_blocks data 30582 1726855369.76676: done generating all_blocks data 30582 1726855369.76681: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855369.76683: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855369.76685: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30582 1726855369.76926: done processing included file 30582 1726855369.76928: iterating over new_blocks loaded from include file 30582 1726855369.76930: in VariableManager get_vars() 30582 1726855369.76948: done with get_vars() 30582 1726855369.76950: filtering new block on tags 30582 1726855369.76985: done filtering new block on tags 30582 1726855369.76991: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node3 => (item=tasks/create_bridge_profile.yml) 30582 1726855369.76996: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855369.76997: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855369.77000: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30582 1726855369.77171: done processing included file 30582 1726855369.77172: iterating over new_blocks loaded from include file 30582 1726855369.77174: in VariableManager get_vars() 30582 1726855369.77193: done with get_vars() 30582 1726855369.77195: filtering new block on tags 30582 1726855369.77217: done filtering new block on tags 30582 1726855369.77219: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node3 => (item=tasks/activate_profile.yml) 30582 1726855369.77223: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855369.77224: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855369.77226: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30582 1726855369.77316: done processing included file 30582 1726855369.77318: iterating over new_blocks loaded from include file 30582 1726855369.77319: in VariableManager get_vars() 30582 1726855369.77335: done with get_vars() 30582 1726855369.77338: filtering new block on tags 30582 1726855369.77359: done filtering new block on tags 30582 1726855369.77362: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml) 30582 1726855369.77365: extending task lists for all hosts with included blocks 30582 1726855369.78908: done extending task lists 30582 1726855369.78910: done processing 
included files 30582 1726855369.78910: results queue empty 30582 1726855369.78911: checking for any_errors_fatal 30582 1726855369.78915: done checking for any_errors_fatal 30582 1726855369.78915: checking for max_fail_percentage 30582 1726855369.78917: done checking for max_fail_percentage 30582 1726855369.78917: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.78918: done checking to see if all hosts have failed 30582 1726855369.78919: getting the remaining hosts for this loop 30582 1726855369.78920: done getting the remaining hosts for this loop 30582 1726855369.78923: getting the next task for host managed_node3 30582 1726855369.78927: done getting next task for host managed_node3 30582 1726855369.78929: ^ task is: TASK: Include network role 30582 1726855369.78933: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855369.78936: getting variables 30582 1726855369.78937: in VariableManager get_vars() 30582 1726855369.78949: Calling all_inventory to load vars for managed_node3 30582 1726855369.78951: Calling groups_inventory to load vars for managed_node3 30582 1726855369.78954: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.78960: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.78962: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.78965: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.80235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.81744: done with get_vars() 30582 1726855369.81767: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 14:02:49 -0400 (0:00:00.146) 0:01:46.168 ****** 30582 1726855369.81848: entering _queue_task() for managed_node3/include_role 30582 1726855369.82233: worker is 1 (out of 1 available) 30582 1726855369.82249: exiting _queue_task() for managed_node3/include_role 30582 1726855369.82263: done queuing things up, now waiting for results queue to drain 30582 1726855369.82265: waiting for pending results... 
30582 1726855369.82544: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855369.82656: in run() - task 0affcc66-ac2b-aa83-7d57-000000002139 30582 1726855369.82696: variable 'ansible_search_path' from source: unknown 30582 1726855369.82700: variable 'ansible_search_path' from source: unknown 30582 1726855369.82747: calling self._execute() 30582 1726855369.82845: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.82894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.82899: variable 'omit' from source: magic vars 30582 1726855369.83295: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.83313: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.83326: _execute() done 30582 1726855369.83397: dumping result to json 30582 1726855369.83401: done dumping result, returning 30582 1726855369.83405: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000002139] 30582 1726855369.83407: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002139 30582 1726855369.83615: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002139 30582 1726855369.83619: WORKER PROCESS EXITING 30582 1726855369.83648: no more pending results, returning what we have 30582 1726855369.83655: in VariableManager get_vars() 30582 1726855369.83715: Calling all_inventory to load vars for managed_node3 30582 1726855369.83718: Calling groups_inventory to load vars for managed_node3 30582 1726855369.83723: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.83737: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.83741: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.83745: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.85445: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.87148: done with get_vars() 30582 1726855369.87184: variable 'ansible_search_path' from source: unknown 30582 1726855369.87185: variable 'ansible_search_path' from source: unknown 30582 1726855369.87409: variable 'omit' from source: magic vars 30582 1726855369.87448: variable 'omit' from source: magic vars 30582 1726855369.87467: variable 'omit' from source: magic vars 30582 1726855369.87472: we have included files to process 30582 1726855369.87473: generating all_blocks data 30582 1726855369.87475: done generating all_blocks data 30582 1726855369.87476: processing included file: fedora.linux_system_roles.network 30582 1726855369.87501: in VariableManager get_vars() 30582 1726855369.87524: done with get_vars() 30582 1726855369.87552: in VariableManager get_vars() 30582 1726855369.87575: done with get_vars() 30582 1726855369.87621: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855369.87753: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855369.87846: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855369.88354: in VariableManager get_vars() 30582 1726855369.88383: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855369.90454: iterating over new_blocks loaded from include file 30582 1726855369.90456: in VariableManager get_vars() 30582 1726855369.90478: done with get_vars() 30582 1726855369.90480: filtering new block on tags 30582 1726855369.90855: done filtering new block on tags 30582 1726855369.90859: in VariableManager get_vars() 30582 1726855369.90877: done with get_vars() 30582 1726855369.90879: filtering new block on tags 30582 1726855369.90902: done 
filtering new block on tags 30582 1726855369.90904: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855369.90909: extending task lists for all hosts with included blocks 30582 1726855369.91076: done extending task lists 30582 1726855369.91078: done processing included files 30582 1726855369.91078: results queue empty 30582 1726855369.91079: checking for any_errors_fatal 30582 1726855369.91083: done checking for any_errors_fatal 30582 1726855369.91084: checking for max_fail_percentage 30582 1726855369.91085: done checking for max_fail_percentage 30582 1726855369.91086: checking to see if all hosts have failed and the running result is not ok 30582 1726855369.91086: done checking to see if all hosts have failed 30582 1726855369.91089: getting the remaining hosts for this loop 30582 1726855369.91090: done getting the remaining hosts for this loop 30582 1726855369.91093: getting the next task for host managed_node3 30582 1726855369.91097: done getting next task for host managed_node3 30582 1726855369.91100: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855369.91103: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855369.91118: getting variables 30582 1726855369.91119: in VariableManager get_vars() 30582 1726855369.91133: Calling all_inventory to load vars for managed_node3 30582 1726855369.91135: Calling groups_inventory to load vars for managed_node3 30582 1726855369.91137: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.91142: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.91145: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.91147: Calling groups_plugins_play to load vars for managed_node3 30582 1726855369.92385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855369.94026: done with get_vars() 30582 1726855369.94057: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:49 -0400 (0:00:00.122) 0:01:46.291 ****** 30582 1726855369.94141: entering _queue_task() for managed_node3/include_tasks 30582 1726855369.94585: worker is 1 (out of 1 available) 30582 1726855369.94798: exiting _queue_task() for managed_node3/include_tasks 30582 1726855369.94807: done queuing things up, now waiting for results queue to drain 30582 1726855369.94809: waiting for pending results... 
30582 1726855369.94942: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855369.95147: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a3 30582 1726855369.95151: variable 'ansible_search_path' from source: unknown 30582 1726855369.95154: variable 'ansible_search_path' from source: unknown 30582 1726855369.95171: calling self._execute() 30582 1726855369.95286: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855369.95301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855369.95316: variable 'omit' from source: magic vars 30582 1726855369.95894: variable 'ansible_distribution_major_version' from source: facts 30582 1726855369.95899: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855369.95901: _execute() done 30582 1726855369.95904: dumping result to json 30582 1726855369.95906: done dumping result, returning 30582 1726855369.95916: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-0000000021a3] 30582 1726855369.95919: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a3 30582 1726855369.96079: no more pending results, returning what we have 30582 1726855369.96086: in VariableManager get_vars() 30582 1726855369.96158: Calling all_inventory to load vars for managed_node3 30582 1726855369.96162: Calling groups_inventory to load vars for managed_node3 30582 1726855369.96167: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855369.96181: Calling all_plugins_play to load vars for managed_node3 30582 1726855369.96185: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855369.96192: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a3 30582 1726855369.96196: WORKER PROCESS EXITING 30582 1726855369.96294: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855369.98221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855370.00383: done with get_vars() 30582 1726855370.00608: variable 'ansible_search_path' from source: unknown 30582 1726855370.00610: variable 'ansible_search_path' from source: unknown 30582 1726855370.00654: we have included files to process 30582 1726855370.00655: generating all_blocks data 30582 1726855370.00657: done generating all_blocks data 30582 1726855370.00661: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855370.00662: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855370.00667: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855370.01671: done processing included file 30582 1726855370.01673: iterating over new_blocks loaded from include file 30582 1726855370.01675: in VariableManager get_vars() 30582 1726855370.01906: done with get_vars() 30582 1726855370.01909: filtering new block on tags 30582 1726855370.01947: done filtering new block on tags 30582 1726855370.01950: in VariableManager get_vars() 30582 1726855370.01979: done with get_vars() 30582 1726855370.01980: filtering new block on tags 30582 1726855370.02026: done filtering new block on tags 30582 1726855370.02029: in VariableManager get_vars() 30582 1726855370.02052: done with get_vars() 30582 1726855370.02054: filtering new block on tags 30582 1726855370.02307: done filtering new block on tags 30582 1726855370.02310: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855370.02315: extending task lists for 
all hosts with included blocks 30582 1726855370.06053: done extending task lists 30582 1726855370.06056: done processing included files 30582 1726855370.06057: results queue empty 30582 1726855370.06058: checking for any_errors_fatal 30582 1726855370.06061: done checking for any_errors_fatal 30582 1726855370.06062: checking for max_fail_percentage 30582 1726855370.06066: done checking for max_fail_percentage 30582 1726855370.06067: checking to see if all hosts have failed and the running result is not ok 30582 1726855370.06068: done checking to see if all hosts have failed 30582 1726855370.06069: getting the remaining hosts for this loop 30582 1726855370.06070: done getting the remaining hosts for this loop 30582 1726855370.06073: getting the next task for host managed_node3 30582 1726855370.06078: done getting next task for host managed_node3 30582 1726855370.06081: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855370.06086: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855370.06100: getting variables 30582 1726855370.06102: in VariableManager get_vars() 30582 1726855370.06123: Calling all_inventory to load vars for managed_node3 30582 1726855370.06126: Calling groups_inventory to load vars for managed_node3 30582 1726855370.06128: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855370.06134: Calling all_plugins_play to load vars for managed_node3 30582 1726855370.06136: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855370.06139: Calling groups_plugins_play to load vars for managed_node3 30582 1726855370.08623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855370.12240: done with get_vars() 30582 1726855370.12280: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:02:50 -0400 (0:00:00.182) 0:01:46.473 ****** 30582 1726855370.12373: entering _queue_task() for managed_node3/setup 30582 1726855370.13181: worker is 1 (out of 1 available) 30582 1726855370.13398: exiting _queue_task() for managed_node3/setup 30582 1726855370.13414: done queuing things up, now waiting for results queue to drain 30582 1726855370.13416: waiting for pending results... 
30582 1726855370.14012: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855370.14377: in run() - task 0affcc66-ac2b-aa83-7d57-000000002200 30582 1726855370.14391: variable 'ansible_search_path' from source: unknown 30582 1726855370.14395: variable 'ansible_search_path' from source: unknown 30582 1726855370.14436: calling self._execute() 30582 1726855370.14953: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.14957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855370.14970: variable 'omit' from source: magic vars 30582 1726855370.16097: variable 'ansible_distribution_major_version' from source: facts 30582 1726855370.16122: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855370.16541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855370.19023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855370.19174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855370.19178: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855370.19180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855370.19183: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855370.19368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855370.19401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855370.19422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855370.19463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855370.19477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855370.19811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855370.19844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855370.19870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855370.19934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855370.19937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855370.20257: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855370.20263: variable 'ansible_facts' from source: unknown 30582 1726855370.21030: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855370.21034: when evaluation is False, skipping this task 30582 1726855370.21037: _execute() done 30582 1726855370.21039: dumping result to json 30582 1726855370.21042: done dumping result, returning 30582 1726855370.21044: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-000000002200] 30582 1726855370.21046: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002200 30582 1726855370.21321: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002200 30582 1726855370.21324: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855370.21367: no more pending results, returning what we have 30582 1726855370.21370: results queue empty 30582 1726855370.21371: checking for any_errors_fatal 30582 1726855370.21373: done checking for any_errors_fatal 30582 1726855370.21374: checking for max_fail_percentage 30582 1726855370.21375: done checking for max_fail_percentage 30582 1726855370.21376: checking to see if all hosts have failed and the running result is not ok 30582 1726855370.21377: done checking to see if all hosts have failed 30582 1726855370.21377: getting the remaining hosts for this loop 30582 1726855370.21379: done getting the remaining hosts for this loop 30582 1726855370.21382: getting the next task for host managed_node3 30582 1726855370.21393: done getting next task for host managed_node3 30582 1726855370.21396: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855370.21403: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855370.21421: getting variables 30582 1726855370.21423: in VariableManager get_vars() 30582 1726855370.21508: Calling all_inventory to load vars for managed_node3 30582 1726855370.21512: Calling groups_inventory to load vars for managed_node3 30582 1726855370.21514: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855370.21523: Calling all_plugins_play to load vars for managed_node3 30582 1726855370.21526: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855370.21534: Calling groups_plugins_play to load vars for managed_node3 30582 1726855370.24543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855370.27361: done with get_vars() 30582 1726855370.27397: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:50 -0400 (0:00:00.151) 0:01:46.624 ****** 30582 1726855370.27502: entering _queue_task() for managed_node3/stat 30582 1726855370.27876: worker is 1 (out of 1 available) 30582 1726855370.27892: exiting _queue_task() for managed_node3/stat 30582 1726855370.27904: done queuing things up, now waiting for results queue to drain 30582 1726855370.27906: waiting for pending results... 
30582 1726855370.28277: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855370.28340: in run() - task 0affcc66-ac2b-aa83-7d57-000000002202 30582 1726855370.28593: variable 'ansible_search_path' from source: unknown 30582 1726855370.28597: variable 'ansible_search_path' from source: unknown 30582 1726855370.28600: calling self._execute() 30582 1726855370.28603: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.28606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855370.28609: variable 'omit' from source: magic vars 30582 1726855370.28898: variable 'ansible_distribution_major_version' from source: facts 30582 1726855370.28908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855370.29192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855370.29351: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855370.29398: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855370.29431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855370.29470: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855370.29640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855370.29646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855370.29649: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855370.29651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855370.29715: variable '__network_is_ostree' from source: set_fact 30582 1726855370.29721: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855370.29725: when evaluation is False, skipping this task 30582 1726855370.29727: _execute() done 30582 1726855370.29730: dumping result to json 30582 1726855370.29734: done dumping result, returning 30582 1726855370.29745: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000002202] 30582 1726855370.29748: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002202 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855370.29897: no more pending results, returning what we have 30582 1726855370.29902: results queue empty 30582 1726855370.29904: checking for any_errors_fatal 30582 1726855370.29918: done checking for any_errors_fatal 30582 1726855370.29919: checking for max_fail_percentage 30582 1726855370.29922: done checking for max_fail_percentage 30582 1726855370.29923: checking to see if all hosts have failed and the running result is not ok 30582 1726855370.29924: done checking to see if all hosts have failed 30582 1726855370.29925: getting the remaining hosts for this loop 30582 1726855370.29927: done getting the remaining hosts for this loop 30582 1726855370.29931: getting the next task for host managed_node3 30582 1726855370.29940: done getting next task for host managed_node3 30582 
1726855370.29944: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855370.29949: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855370.29975: getting variables 30582 1726855370.29978: in VariableManager get_vars() 30582 1726855370.30254: Calling all_inventory to load vars for managed_node3 30582 1726855370.30257: Calling groups_inventory to load vars for managed_node3 30582 1726855370.30260: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855370.30273: Calling all_plugins_play to load vars for managed_node3 30582 1726855370.30277: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855370.30281: Calling groups_plugins_play to load vars for managed_node3 30582 1726855370.31065: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002202 30582 1726855370.31070: WORKER PROCESS EXITING 30582 1726855370.32047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855370.33756: done with get_vars() 30582 1726855370.33780: done getting variables 30582 1726855370.33843: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:50 -0400 (0:00:00.063) 0:01:46.688 ****** 30582 1726855370.33883: entering _queue_task() for managed_node3/set_fact 30582 1726855370.34552: worker is 1 (out of 1 available) 30582 1726855370.34567: exiting _queue_task() for managed_node3/set_fact 30582 1726855370.34583: done queuing things up, now waiting for results queue to drain 30582 1726855370.34585: waiting for pending results... 
30582 1726855370.35110: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855370.35338: in run() - task 0affcc66-ac2b-aa83-7d57-000000002203 30582 1726855370.35448: variable 'ansible_search_path' from source: unknown 30582 1726855370.35454: variable 'ansible_search_path' from source: unknown 30582 1726855370.35456: calling self._execute() 30582 1726855370.35523: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.35534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855370.35553: variable 'omit' from source: magic vars 30582 1726855370.35950: variable 'ansible_distribution_major_version' from source: facts 30582 1726855370.35966: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855370.36145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855370.36450: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855370.36530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855370.36543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855370.36586: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855370.36684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855370.36747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855370.36751: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855370.36789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855370.36890: variable '__network_is_ostree' from source: set_fact 30582 1726855370.36903: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855370.36965: when evaluation is False, skipping this task 30582 1726855370.36968: _execute() done 30582 1726855370.36970: dumping result to json 30582 1726855370.36973: done dumping result, returning 30582 1726855370.36975: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000002203] 30582 1726855370.36977: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002203 30582 1726855370.37048: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002203 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855370.37105: no more pending results, returning what we have 30582 1726855370.37110: results queue empty 30582 1726855370.37111: checking for any_errors_fatal 30582 1726855370.37117: done checking for any_errors_fatal 30582 1726855370.37118: checking for max_fail_percentage 30582 1726855370.37121: done checking for max_fail_percentage 30582 1726855370.37122: checking to see if all hosts have failed and the running result is not ok 30582 1726855370.37123: done checking to see if all hosts have failed 30582 1726855370.37124: getting the remaining hosts for this loop 30582 1726855370.37126: done getting the remaining hosts for this loop 30582 1726855370.37130: getting the next task for 
host managed_node3 30582 1726855370.37142: done getting next task for host managed_node3 30582 1726855370.37147: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855370.37153: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855370.37294: getting variables 30582 1726855370.37296: in VariableManager get_vars() 30582 1726855370.37348: Calling all_inventory to load vars for managed_node3 30582 1726855370.37351: Calling groups_inventory to load vars for managed_node3 30582 1726855370.37354: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855370.37365: Calling all_plugins_play to load vars for managed_node3 30582 1726855370.37369: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855370.37374: Calling groups_plugins_play to load vars for managed_node3 30582 1726855370.37901: WORKER PROCESS EXITING 30582 1726855370.39008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855370.40641: done with get_vars() 30582 1726855370.40671: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:50 -0400 (0:00:00.068) 0:01:46.757 ****** 30582 1726855370.40779: entering _queue_task() for managed_node3/service_facts 30582 1726855370.41142: worker is 1 (out of 1 available) 30582 1726855370.41156: exiting _queue_task() for managed_node3/service_facts 30582 1726855370.41169: done queuing things up, now waiting for results queue to drain 30582 1726855370.41171: waiting for pending results... 
30582 1726855370.41475: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855370.41641: in run() - task 0affcc66-ac2b-aa83-7d57-000000002205 30582 1726855370.41663: variable 'ansible_search_path' from source: unknown 30582 1726855370.41672: variable 'ansible_search_path' from source: unknown 30582 1726855370.41718: calling self._execute() 30582 1726855370.41820: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.41941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855370.41951: variable 'omit' from source: magic vars 30582 1726855370.42250: variable 'ansible_distribution_major_version' from source: facts 30582 1726855370.42274: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855370.42292: variable 'omit' from source: magic vars 30582 1726855370.42381: variable 'omit' from source: magic vars 30582 1726855370.42431: variable 'omit' from source: magic vars 30582 1726855370.42483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855370.42594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855370.42597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855370.42600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855370.42604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855370.42621: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855370.42630: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.42638: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855370.42761: Set connection var ansible_timeout to 10 30582 1726855370.42769: Set connection var ansible_connection to ssh 30582 1726855370.42781: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855370.42794: Set connection var ansible_pipelining to False 30582 1726855370.42811: Set connection var ansible_shell_executable to /bin/sh 30582 1726855370.42819: Set connection var ansible_shell_type to sh 30582 1726855370.42849: variable 'ansible_shell_executable' from source: unknown 30582 1726855370.42857: variable 'ansible_connection' from source: unknown 30582 1726855370.42919: variable 'ansible_module_compression' from source: unknown 30582 1726855370.42922: variable 'ansible_shell_type' from source: unknown 30582 1726855370.42925: variable 'ansible_shell_executable' from source: unknown 30582 1726855370.42927: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855370.42929: variable 'ansible_pipelining' from source: unknown 30582 1726855370.42934: variable 'ansible_timeout' from source: unknown 30582 1726855370.42937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855370.43106: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855370.43123: variable 'omit' from source: magic vars 30582 1726855370.43137: starting attempt loop 30582 1726855370.43144: running the handler 30582 1726855370.43168: _low_level_execute_command(): starting 30582 1726855370.43181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855370.43974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855370.43994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30582 1726855370.44008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855370.44063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855370.44242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855370.44257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855370.44364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855370.46058: stdout chunk (state=3): >>>/root <<< 30582 1726855370.46194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855370.46216: stdout chunk (state=3): >>><<< 30582 1726855370.46232: stderr chunk (state=3): >>><<< 30582 1726855370.46357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855370.46361: _low_level_execute_command(): starting 30582 1726855370.46367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915 `" && echo ansible-tmp-1726855370.4625776-35573-185943698005915="` echo /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915 `" ) && sleep 0' 30582 1726855370.46927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855370.46940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855370.47006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855370.47078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855370.47106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855370.47196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855370.49138: stdout chunk (state=3): >>>ansible-tmp-1726855370.4625776-35573-185943698005915=/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915 <<< 30582 1726855370.49304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855370.49308: stdout chunk (state=3): >>><<< 30582 1726855370.49310: stderr chunk (state=3): >>><<< 30582 1726855370.49333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855370.4625776-35573-185943698005915=/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855370.49389: variable 'ansible_module_compression' from source: unknown 30582 1726855370.49444: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855370.49489: variable 'ansible_facts' from source: unknown 30582 1726855370.49674: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py 30582 1726855370.49807: Sending initial data 30582 1726855370.49810: Sent initial data (162 bytes) 30582 1726855370.50376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855370.50442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855370.50504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855370.50528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855370.50554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855370.50642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855370.52213: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855370.52286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855370.52372: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_ubz6sal /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py <<< 30582 1726855370.52383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py" <<< 30582 1726855370.52437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp_ubz6sal" to remote "/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py" <<< 30582 1726855370.53354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855370.53361: stderr chunk (state=3): >>><<< 30582 1726855370.53470: stdout chunk (state=3): >>><<< 30582 1726855370.53474: done transferring module to remote 30582 1726855370.53476: _low_level_execute_command(): starting 30582 1726855370.53479: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/ /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py && sleep 0' 30582 1726855370.54033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855370.54046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855370.54062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855370.54081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855370.54117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855370.54125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855370.54194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855370.54201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855370.54271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855370.56093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855370.56096: stdout chunk (state=3): >>><<< 30582 1726855370.56111: stderr chunk (state=3): >>><<< 30582 1726855370.56200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855370.56204: _low_level_execute_command(): starting 30582 1726855370.56207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/AnsiballZ_service_facts.py && sleep 0' 30582 1726855370.56810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855370.56813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855370.56816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855370.56818: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855370.56820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855370.56855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855370.56895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855370.57002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.08502: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855372.09734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.09752: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 30582 1726855372.09817: stderr chunk (state=3): >>><<< 30582 1726855372.09846: stdout chunk (state=3): >>><<< 30582 1726855372.09881: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": 
{"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855372.11149: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855372.11168: _low_level_execute_command(): starting 30582 1726855372.11260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855370.4625776-35573-185943698005915/ > /dev/null 2>&1 && sleep 0' 30582 1726855372.11879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855372.11903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.11917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855372.11941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855372.11957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855372.11980: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855372.12019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.12135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.12155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.12254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.14129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.14139: stdout chunk (state=3): >>><<< 30582 1726855372.14155: stderr chunk (state=3): >>><<< 30582 1726855372.14184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855372.14392: handler run complete 30582 1726855372.14408: variable 'ansible_facts' from source: unknown 30582 1726855372.14657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855372.15534: variable 'ansible_facts' from source: unknown 30582 1726855372.15684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855372.16094: attempt loop complete, returning result 30582 1726855372.16106: _execute() done 30582 1726855372.16113: dumping result to json 30582 1726855372.16313: done dumping result, returning 30582 1726855372.16353: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000002205] 30582 1726855372.16371: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002205 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855372.18836: no more pending results, returning what we have 30582 1726855372.18839: results queue empty 30582 1726855372.18840: checking for any_errors_fatal 30582 1726855372.18843: done checking for any_errors_fatal 30582 1726855372.18843: checking for max_fail_percentage 30582 1726855372.18884: done checking for max_fail_percentage 30582 1726855372.18886: checking to see if all hosts have failed and the running result is not ok 30582 1726855372.18889: done checking to see if all hosts have failed 30582 1726855372.18890: getting the remaining hosts for this loop 30582 1726855372.18891: done getting the remaining hosts for this loop 30582 1726855372.18895: getting the next task for host managed_node3 30582 1726855372.18902: done getting next task for 
host managed_node3 30582 1726855372.18905: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855372.18911: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855372.18920: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002205 30582 1726855372.18922: WORKER PROCESS EXITING 30582 1726855372.18932: getting variables 30582 1726855372.18933: in VariableManager get_vars() 30582 1726855372.19000: Calling all_inventory to load vars for managed_node3 30582 1726855372.19003: Calling groups_inventory to load vars for managed_node3 30582 1726855372.19006: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855372.19015: Calling all_plugins_play to load vars for managed_node3 30582 1726855372.19021: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855372.19024: Calling groups_plugins_play to load vars for managed_node3 30582 1726855372.20647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855372.23425: done with get_vars() 30582 1726855372.23463: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:52 -0400 (0:00:01.829) 0:01:48.587 ****** 30582 1726855372.23734: entering _queue_task() for managed_node3/package_facts 30582 1726855372.24905: worker is 1 (out of 1 available) 30582 1726855372.24920: exiting _queue_task() for managed_node3/package_facts 30582 1726855372.24932: done queuing things up, now waiting for results queue to drain 30582 1726855372.24934: waiting for pending results... 
30582 1726855372.25466: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855372.25668: in run() - task 0affcc66-ac2b-aa83-7d57-000000002206 30582 1726855372.25673: variable 'ansible_search_path' from source: unknown 30582 1726855372.25676: variable 'ansible_search_path' from source: unknown 30582 1726855372.25774: calling self._execute() 30582 1726855372.25896: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855372.25913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855372.25929: variable 'omit' from source: magic vars 30582 1726855372.26359: variable 'ansible_distribution_major_version' from source: facts 30582 1726855372.26379: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855372.26392: variable 'omit' from source: magic vars 30582 1726855372.26482: variable 'omit' from source: magic vars 30582 1726855372.26520: variable 'omit' from source: magic vars 30582 1726855372.26638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855372.26641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855372.26644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855372.26661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855372.26680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855372.26717: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855372.26726: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855372.26733: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855372.26844: Set connection var ansible_timeout to 10 30582 1726855372.26857: Set connection var ansible_connection to ssh 30582 1726855372.26877: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855372.26886: Set connection var ansible_pipelining to False 30582 1726855372.26897: Set connection var ansible_shell_executable to /bin/sh 30582 1726855372.26903: Set connection var ansible_shell_type to sh 30582 1726855372.26929: variable 'ansible_shell_executable' from source: unknown 30582 1726855372.26966: variable 'ansible_connection' from source: unknown 30582 1726855372.26975: variable 'ansible_module_compression' from source: unknown 30582 1726855372.26977: variable 'ansible_shell_type' from source: unknown 30582 1726855372.26979: variable 'ansible_shell_executable' from source: unknown 30582 1726855372.26981: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855372.26983: variable 'ansible_pipelining' from source: unknown 30582 1726855372.26985: variable 'ansible_timeout' from source: unknown 30582 1726855372.26988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855372.27292: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855372.27299: variable 'omit' from source: magic vars 30582 1726855372.27301: starting attempt loop 30582 1726855372.27303: running the handler 30582 1726855372.27305: _low_level_execute_command(): starting 30582 1726855372.27307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855372.28155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.28192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.28352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.30000: stdout chunk (state=3): >>>/root <<< 30582 1726855372.30412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.30419: stdout chunk (state=3): >>><<< 30582 1726855372.30423: stderr chunk (state=3): >>><<< 30582 1726855372.30427: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855372.30430: _low_level_execute_command(): starting 30582 1726855372.30433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652 `" && echo ansible-tmp-1726855372.3031547-35632-70157405483652="` echo /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652 `" ) && sleep 0' 30582 1726855372.31367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.31383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.31441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.31661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.31723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.31821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.33744: stdout chunk (state=3): >>>ansible-tmp-1726855372.3031547-35632-70157405483652=/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652 <<< 30582 1726855372.33882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.33895: stdout chunk (state=3): >>><<< 30582 1726855372.33907: stderr chunk (state=3): >>><<< 30582 1726855372.33928: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855372.3031547-35632-70157405483652=/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855372.34048: variable 'ansible_module_compression' from source: unknown 30582 1726855372.34057: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855372.34128: variable 'ansible_facts' from source: unknown 30582 1726855372.34346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py 30582 1726855372.34596: Sending initial data 30582 1726855372.34599: Sent initial data (161 bytes) 30582 1726855372.35268: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.35446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 
30582 1726855372.35593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.35678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.37319: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855372.37403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855372.37482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp5pezqmoc /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py <<< 30582 1726855372.37486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py" <<< 30582 1726855372.37550: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp5pezqmoc" to remote "/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py" <<< 30582 1726855372.39792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.39796: stdout chunk (state=3): >>><<< 30582 1726855372.39799: stderr chunk (state=3): >>><<< 30582 1726855372.40000: done transferring module to remote 30582 1726855372.40004: _low_level_execute_command(): starting 30582 1726855372.40006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/ /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py && sleep 0' 30582 1726855372.40961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.40965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.40967: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855372.40970: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855372.40972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.41292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.41402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.41499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.43322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.43363: stderr chunk (state=3): >>><<< 30582 1726855372.43371: stdout chunk (state=3): >>><<< 30582 1726855372.43472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855372.43476: _low_level_execute_command(): starting 30582 1726855372.43479: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/AnsiballZ_package_facts.py && sleep 0' 30582 1726855372.44554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.44561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855372.44563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.44566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855372.44568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.44908: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.44942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.45020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.89245: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": 
"12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30582 1726855372.89358: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", 
"version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": 
"libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30582 1726855372.89471: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30582 1726855372.89479: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": 
[{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": 
"2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": 
[{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855372.89485: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30582 1726855372.89512: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855372.91275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
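(Editor's aside: the module result collected above follows the standard `package_facts` shape — `ansible_facts.packages` maps each package name to a list of installed instances, each with `version`, `release`, `epoch`, `arch`, and `source`. A minimal sketch of reading such a payload outside Ansible; the JSON excerpt below is a hypothetical one-package sample shaped like the log output, not the full payload:)

```python
import json

# A tiny excerpt shaped like the package_facts result in the log above
# (hypothetical sample, not the full payload).
result = json.loads("""
{"ansible_facts": {"packages": {
  "openssl": [{"name": "openssl", "version": "3.2.2",
               "release": "12.el10", "epoch": 1,
               "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = result["ansible_facts"]["packages"]

def pkg_version(name):
    """Return the version of the first installed instance, or None."""
    instances = packages.get(name, [])
    return instances[0]["version"] if instances else None

print(pkg_version("openssl"))  # -> 3.2.2
```

(Inside a play, the same lookup is usually done with `ansible_facts.packages['openssl'][0].version` after a `package_facts` task; each value is a list because multiple arch/version instances of one package can coexist on RPM systems.)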
<<< 30582 1726855372.91279: stdout chunk (state=3): >>><<< 30582 1726855372.91282: stderr chunk (state=3): >>><<< 30582 1726855372.91486: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855372.93928: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855372.93958: _low_level_execute_command(): starting 30582 1726855372.93969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855372.3031547-35632-70157405483652/ > /dev/null 2>&1 && sleep 0' 30582 1726855372.94603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855372.94617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855372.94631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855372.94648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855372.94691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855372.94765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855372.94799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855372.94824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855372.94916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855372.96828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855372.96832: stdout chunk (state=3): >>><<< 30582 1726855372.96834: stderr chunk (state=3): >>><<< 30582 1726855372.96849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855372.96993: handler run complete 30582 1726855372.97734: variable 'ansible_facts' from source: unknown 30582 1726855372.98289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.07954: variable 'ansible_facts' from source: unknown 30582 1726855373.08618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.09453: attempt loop complete, returning result 30582 1726855373.09471: _execute() done 30582 1726855373.09482: dumping result to json 30582 1726855373.09681: done dumping result, returning 30582 1726855373.09701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000002206] 30582 1726855373.09710: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002206 30582 1726855373.12873: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002206 30582 1726855373.12877: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855373.13024: no more pending results, returning what we have 30582 1726855373.13028: results queue empty 30582 1726855373.13029: checking for any_errors_fatal 30582 1726855373.13033: done checking for any_errors_fatal 30582 1726855373.13034: checking for max_fail_percentage 30582 1726855373.13036: done checking for max_fail_percentage 30582 1726855373.13036: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.13037: done checking to see if all hosts have failed 30582 1726855373.13038: getting the remaining hosts for this loop 30582 1726855373.13039: done getting the remaining hosts for this loop 30582 1726855373.13043: getting 
the next task for host managed_node3 30582 1726855373.13051: done getting next task for host managed_node3 30582 1726855373.13055: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855373.13062: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855373.13078: getting variables 30582 1726855373.13079: in VariableManager get_vars() 30582 1726855373.13119: Calling all_inventory to load vars for managed_node3 30582 1726855373.13123: Calling groups_inventory to load vars for managed_node3 30582 1726855373.13125: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.13134: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.13137: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.13140: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.22500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.24186: done with get_vars() 30582 1726855373.24231: done getting variables 30582 1726855373.24284: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:53 -0400 (0:00:01.005) 0:01:49.593 ****** 30582 1726855373.24320: entering _queue_task() for managed_node3/debug 30582 1726855373.24706: worker is 1 (out of 1 available) 30582 1726855373.24719: exiting _queue_task() for managed_node3/debug 30582 1726855373.24730: done queuing things up, now waiting for results queue to drain 30582 1726855373.24732: waiting for pending results... 
30582 1726855373.25120: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855373.25324: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a4 30582 1726855373.25329: variable 'ansible_search_path' from source: unknown 30582 1726855373.25333: variable 'ansible_search_path' from source: unknown 30582 1726855373.25337: calling self._execute() 30582 1726855373.25417: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.25435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.25449: variable 'omit' from source: magic vars 30582 1726855373.25948: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.25965: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.26006: variable 'omit' from source: magic vars 30582 1726855373.26167: variable 'omit' from source: magic vars 30582 1726855373.26297: variable 'network_provider' from source: set_fact 30582 1726855373.26319: variable 'omit' from source: magic vars 30582 1726855373.26380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855373.26430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855373.26513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855373.26518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855373.26521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855373.26537: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855373.26546: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855373.26555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.26667: Set connection var ansible_timeout to 10 30582 1726855373.26675: Set connection var ansible_connection to ssh 30582 1726855373.26689: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855373.26700: Set connection var ansible_pipelining to False 30582 1726855373.26709: Set connection var ansible_shell_executable to /bin/sh 30582 1726855373.26729: Set connection var ansible_shell_type to sh 30582 1726855373.26748: variable 'ansible_shell_executable' from source: unknown 30582 1726855373.26792: variable 'ansible_connection' from source: unknown 30582 1726855373.26795: variable 'ansible_module_compression' from source: unknown 30582 1726855373.26798: variable 'ansible_shell_type' from source: unknown 30582 1726855373.26800: variable 'ansible_shell_executable' from source: unknown 30582 1726855373.26802: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.26805: variable 'ansible_pipelining' from source: unknown 30582 1726855373.26807: variable 'ansible_timeout' from source: unknown 30582 1726855373.26810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.26938: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855373.26959: variable 'omit' from source: magic vars 30582 1726855373.26993: starting attempt loop 30582 1726855373.26996: running the handler 30582 1726855373.27027: handler run complete 30582 1726855373.27045: attempt loop complete, returning result 30582 1726855373.27057: _execute() done 30582 1726855373.27193: dumping result to json 30582 1726855373.27197: done dumping result, returning 
30582 1726855373.27200: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-0000000021a4] 30582 1726855373.27203: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a4 30582 1726855373.27279: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a4 30582 1726855373.27282: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855373.27354: no more pending results, returning what we have 30582 1726855373.27358: results queue empty 30582 1726855373.27359: checking for any_errors_fatal 30582 1726855373.27375: done checking for any_errors_fatal 30582 1726855373.27376: checking for max_fail_percentage 30582 1726855373.27378: done checking for max_fail_percentage 30582 1726855373.27379: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.27380: done checking to see if all hosts have failed 30582 1726855373.27381: getting the remaining hosts for this loop 30582 1726855373.27383: done getting the remaining hosts for this loop 30582 1726855373.27388: getting the next task for host managed_node3 30582 1726855373.27398: done getting next task for host managed_node3 30582 1726855373.27403: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855373.27409: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.27425: getting variables 30582 1726855373.27427: in VariableManager get_vars() 30582 1726855373.27481: Calling all_inventory to load vars for managed_node3 30582 1726855373.27485: Calling groups_inventory to load vars for managed_node3 30582 1726855373.27695: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.27707: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.27711: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.27714: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.29211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.31215: done with get_vars() 30582 1726855373.31249: done getting variables 30582 1726855373.31332: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:53 -0400 (0:00:00.070) 0:01:49.663 ****** 30582 1726855373.31379: entering _queue_task() for managed_node3/fail 30582 1726855373.31750: worker is 1 (out of 1 available) 30582 1726855373.31764: exiting _queue_task() for managed_node3/fail 30582 1726855373.31775: done queuing things up, now waiting for results queue to drain 30582 1726855373.31777: waiting for pending results... 30582 1726855373.32205: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855373.32211: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a5 30582 1726855373.32228: variable 'ansible_search_path' from source: unknown 30582 1726855373.32235: variable 'ansible_search_path' from source: unknown 30582 1726855373.32274: calling self._execute() 30582 1726855373.32382: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.32394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.32410: variable 'omit' from source: magic vars 30582 1726855373.32782: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.32801: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.32936: variable 'network_state' from source: role '' defaults 30582 1726855373.32957: Evaluated conditional (network_state != {}): False 30582 1726855373.32966: when evaluation is False, skipping this task 30582 1726855373.33062: _execute() done 30582 1726855373.33065: dumping result to json 30582 1726855373.33068: done dumping result, returning 30582 1726855373.33071: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-0000000021a5] 30582 1726855373.33075: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a5 30582 1726855373.33152: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a5 30582 1726855373.33156: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855373.33213: no more pending results, returning what we have 30582 1726855373.33218: results queue empty 30582 1726855373.33219: checking for any_errors_fatal 30582 1726855373.33229: done checking for any_errors_fatal 30582 1726855373.33229: checking for max_fail_percentage 30582 1726855373.33232: done checking for max_fail_percentage 30582 1726855373.33233: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.33233: done checking to see if all hosts have failed 30582 1726855373.33234: getting the remaining hosts for this loop 30582 1726855373.33236: done getting the remaining hosts for this loop 30582 1726855373.33240: getting the next task for host managed_node3 30582 1726855373.33249: done getting next task for host managed_node3 30582 1726855373.33252: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855373.33258: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.33286: getting variables 30582 1726855373.33290: in VariableManager get_vars() 30582 1726855373.33340: Calling all_inventory to load vars for managed_node3 30582 1726855373.33343: Calling groups_inventory to load vars for managed_node3 30582 1726855373.33346: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.33358: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.33362: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.33365: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.35003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.36607: done with get_vars() 30582 1726855373.36632: done getting variables 30582 1726855373.36693: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:53 -0400 (0:00:00.053) 0:01:49.717 ****** 30582 1726855373.36730: entering _queue_task() for managed_node3/fail 30582 1726855373.37221: worker is 1 (out of 1 available) 30582 1726855373.37232: exiting _queue_task() for managed_node3/fail 30582 1726855373.37243: done queuing things up, now waiting for results queue to drain 30582 1726855373.37245: waiting for pending results... 30582 1726855373.37483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855373.37594: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a6 30582 1726855373.37614: variable 'ansible_search_path' from source: unknown 30582 1726855373.37621: variable 'ansible_search_path' from source: unknown 30582 1726855373.37662: calling self._execute() 30582 1726855373.37794: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.37798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.37801: variable 'omit' from source: magic vars 30582 1726855373.38166: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.38185: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.38315: variable 'network_state' from source: role '' defaults 30582 1726855373.38493: Evaluated conditional (network_state != {}): False 30582 1726855373.38497: when evaluation is False, skipping this task 30582 1726855373.38500: _execute() done 30582 1726855373.38502: dumping result to json 30582 1726855373.38505: done dumping result, returning 30582 1726855373.38508: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-0000000021a6] 30582 1726855373.38511: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a6 30582 1726855373.38586: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a6 30582 1726855373.38592: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855373.38643: no more pending results, returning what we have 30582 1726855373.38648: results queue empty 30582 1726855373.38649: checking for any_errors_fatal 30582 1726855373.38660: done checking for any_errors_fatal 30582 1726855373.38661: checking for max_fail_percentage 30582 1726855373.38663: done checking for max_fail_percentage 30582 1726855373.38664: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.38665: done checking to see if all hosts have failed 30582 1726855373.38666: getting the remaining hosts for this loop 30582 1726855373.38667: done getting the remaining hosts for this loop 30582 1726855373.38671: getting the next task for host managed_node3 30582 1726855373.38680: done getting next task for host managed_node3 30582 1726855373.38685: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855373.38693: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.38721: getting variables 30582 1726855373.38723: in VariableManager get_vars() 30582 1726855373.38775: Calling all_inventory to load vars for managed_node3 30582 1726855373.38778: Calling groups_inventory to load vars for managed_node3 30582 1726855373.38780: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.38994: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.38998: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.39002: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.40542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.42046: done with get_vars() 30582 1726855373.42073: done getting variables 30582 1726855373.42135: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:53 -0400 (0:00:00.054) 0:01:49.771 ****** 30582 1726855373.42174: entering _queue_task() for managed_node3/fail 30582 1726855373.42523: worker is 1 (out of 1 available) 30582 1726855373.42537: exiting _queue_task() for managed_node3/fail 30582 1726855373.42549: done queuing things up, now waiting for results queue to drain 30582 1726855373.42550: waiting for pending results... 30582 1726855373.42910: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855373.42993: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a7 30582 1726855373.43014: variable 'ansible_search_path' from source: unknown 30582 1726855373.43022: variable 'ansible_search_path' from source: unknown 30582 1726855373.43062: calling self._execute() 30582 1726855373.43167: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.43178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.43194: variable 'omit' from source: magic vars 30582 1726855373.43591: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.43608: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.43786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855373.46156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855373.46161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855373.46186: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855373.46226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855373.46258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855373.46342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.46379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.46414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.46458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.46480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.46591: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.46792: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855373.46796: variable 'ansible_distribution' from source: facts 30582 1726855373.46799: variable '__network_rh_distros' from source: role '' defaults 30582 1726855373.46801: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855373.47014: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.47047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.47076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.47123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.47147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.47199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.47228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.47262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.47307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855373.47327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.47375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.47405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.47432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.47479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.47499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.47826: variable 'network_connections' from source: include params 30582 1726855373.47842: variable 'interface' from source: play vars 30582 1726855373.47915: variable 'interface' from source: play vars 30582 1726855373.47933: variable 'network_state' from source: role '' defaults 30582 1726855373.48002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855373.48194: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855373.48223: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855373.48259: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855373.48318: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855373.48345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855373.48374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855373.48427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.48536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855373.48540: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855373.48542: when evaluation is False, skipping this task 30582 1726855373.48545: _execute() done 30582 1726855373.48547: dumping result to json 30582 1726855373.48550: done dumping result, returning 30582 1726855373.48552: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-0000000021a7] 30582 1726855373.48555: sending task result for task 
0affcc66-ac2b-aa83-7d57-0000000021a7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855373.48740: no more pending results, returning what we have 30582 1726855373.48745: results queue empty 30582 1726855373.48746: checking for any_errors_fatal 30582 1726855373.48754: done checking for any_errors_fatal 30582 1726855373.48755: checking for max_fail_percentage 30582 1726855373.48757: done checking for max_fail_percentage 30582 1726855373.48758: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.48759: done checking to see if all hosts have failed 30582 1726855373.48760: getting the remaining hosts for this loop 30582 1726855373.48761: done getting the remaining hosts for this loop 30582 1726855373.48765: getting the next task for host managed_node3 30582 1726855373.48774: done getting next task for host managed_node3 30582 1726855373.48778: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855373.48784: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.48802: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a7 30582 1726855373.48805: WORKER PROCESS EXITING 30582 1726855373.48821: getting variables 30582 1726855373.48823: in VariableManager get_vars() 30582 1726855373.48874: Calling all_inventory to load vars for managed_node3 30582 1726855373.48877: Calling groups_inventory to load vars for managed_node3 30582 1726855373.48880: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.48995: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.49000: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.49004: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.50589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.52148: done with get_vars() 30582 1726855373.52175: done getting variables 30582 1726855373.52237: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:53 -0400 (0:00:00.100) 0:01:49.872 ****** 30582 1726855373.52276: entering _queue_task() for managed_node3/dnf 30582 1726855373.52638: worker is 1 (out of 1 available) 30582 1726855373.52650: exiting _queue_task() for managed_node3/dnf 30582 1726855373.52663: done queuing things up, now waiting for results queue to drain 30582 1726855373.52665: waiting for pending results... 30582 1726855373.52985: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855373.53294: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a8 30582 1726855373.53299: variable 'ansible_search_path' from source: unknown 30582 1726855373.53302: variable 'ansible_search_path' from source: unknown 30582 1726855373.53305: calling self._execute() 30582 1726855373.53318: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.53329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.53342: variable 'omit' from source: magic vars 30582 1726855373.53726: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.53747: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.53947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855373.56500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855373.56577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855373.56635: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855373.56681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855373.56715: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855373.56801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.56835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.56864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.56920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.57006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.57066: variable 'ansible_distribution' from source: facts 30582 1726855373.57076: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.57098: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855373.57228: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855373.57363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.57394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.57423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.57472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.57493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.57537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.57655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.57659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.57661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.57663: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.57700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.57727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.57754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.57800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.57818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.57976: variable 'network_connections' from source: include params 30582 1726855373.57997: variable 'interface' from source: play vars 30582 1726855373.58061: variable 'interface' from source: play vars 30582 1726855373.58141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855373.58310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855373.58494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855373.58497: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855373.58499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855373.58502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855373.58504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855373.58535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.58565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855373.58630: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855373.58871: variable 'network_connections' from source: include params 30582 1726855373.58882: variable 'interface' from source: play vars 30582 1726855373.58946: variable 'interface' from source: play vars 30582 1726855373.58986: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855373.58997: when evaluation is False, skipping this task 30582 1726855373.59004: _execute() done 30582 1726855373.59011: dumping result to json 30582 1726855373.59018: done dumping result, returning 30582 1726855373.59029: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000021a8] 30582 
1726855373.59062: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a8 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855373.59219: no more pending results, returning what we have 30582 1726855373.59223: results queue empty 30582 1726855373.59224: checking for any_errors_fatal 30582 1726855373.59232: done checking for any_errors_fatal 30582 1726855373.59233: checking for max_fail_percentage 30582 1726855373.59236: done checking for max_fail_percentage 30582 1726855373.59237: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.59238: done checking to see if all hosts have failed 30582 1726855373.59238: getting the remaining hosts for this loop 30582 1726855373.59240: done getting the remaining hosts for this loop 30582 1726855373.59244: getting the next task for host managed_node3 30582 1726855373.59252: done getting next task for host managed_node3 30582 1726855373.59255: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855373.59261: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.59284: getting variables 30582 1726855373.59286: in VariableManager get_vars() 30582 1726855373.59334: Calling all_inventory to load vars for managed_node3 30582 1726855373.59337: Calling groups_inventory to load vars for managed_node3 30582 1726855373.59339: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.59351: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.59354: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.59357: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.60101: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a8 30582 1726855373.60104: WORKER PROCESS EXITING 30582 1726855373.61209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.62775: done with get_vars() 30582 1726855373.62800: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855373.62875: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:53 -0400 (0:00:00.106) 0:01:49.979 ****** 30582 1726855373.62911: entering _queue_task() for managed_node3/yum 30582 1726855373.63256: worker is 1 (out of 1 available) 30582 1726855373.63270: exiting _queue_task() for managed_node3/yum 30582 1726855373.63282: done queuing things up, now waiting for results queue to drain 30582 1726855373.63283: waiting for pending results... 30582 1726855373.63579: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855373.63738: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021a9 30582 1726855373.63758: variable 'ansible_search_path' from source: unknown 30582 1726855373.63765: variable 'ansible_search_path' from source: unknown 30582 1726855373.63806: calling self._execute() 30582 1726855373.63913: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.63928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.63941: variable 'omit' from source: magic vars 30582 1726855373.64321: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.64337: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.64520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855373.66784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855373.66868: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855373.66915: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855373.66952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855373.66985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855373.67069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.67105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.67134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.67180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.67201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.67301: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.67394: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855373.67397: when evaluation is False, skipping this task 30582 1726855373.67400: _execute() done 30582 1726855373.67402: dumping result to json 30582 1726855373.67404: done dumping result, 
returning 30582 1726855373.67407: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000021a9] 30582 1726855373.67409: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a9 30582 1726855373.67483: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021a9 30582 1726855373.67486: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855373.67547: no more pending results, returning what we have 30582 1726855373.67551: results queue empty 30582 1726855373.67552: checking for any_errors_fatal 30582 1726855373.67560: done checking for any_errors_fatal 30582 1726855373.67560: checking for max_fail_percentage 30582 1726855373.67563: done checking for max_fail_percentage 30582 1726855373.67564: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.67564: done checking to see if all hosts have failed 30582 1726855373.67565: getting the remaining hosts for this loop 30582 1726855373.67567: done getting the remaining hosts for this loop 30582 1726855373.67570: getting the next task for host managed_node3 30582 1726855373.67578: done getting next task for host managed_node3 30582 1726855373.67582: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855373.67590: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.67613: getting variables 30582 1726855373.67615: in VariableManager get_vars() 30582 1726855373.67663: Calling all_inventory to load vars for managed_node3 30582 1726855373.67665: Calling groups_inventory to load vars for managed_node3 30582 1726855373.67668: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.67680: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.67683: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.67685: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.69306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.70995: done with get_vars() 30582 1726855373.71017: done getting variables 30582 1726855373.71078: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:53 -0400 (0:00:00.082) 0:01:50.061 ****** 30582 1726855373.71118: entering _queue_task() for managed_node3/fail 30582 1726855373.71470: worker is 1 (out of 1 available) 30582 1726855373.71484: exiting _queue_task() for managed_node3/fail 30582 1726855373.71897: done queuing things up, now waiting for results queue to drain 30582 1726855373.71900: waiting for pending results... 30582 1726855373.72358: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855373.72563: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021aa 30582 1726855373.72566: variable 'ansible_search_path' from source: unknown 30582 1726855373.72569: variable 'ansible_search_path' from source: unknown 30582 1726855373.72571: calling self._execute() 30582 1726855373.72749: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.72813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.72828: variable 'omit' from source: magic vars 30582 1726855373.73634: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.73653: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.73904: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855373.74246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855373.76633: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855373.76727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855373.76769: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855373.76817: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855373.76847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855373.76932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.77003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.77006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.77044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.77065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.77122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 
1726855373.77151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.77179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.77327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.77330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.77333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.77335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.77337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.77379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.77400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30582 1726855373.77582: variable 'network_connections' from source: include params 30582 1726855373.77603: variable 'interface' from source: play vars 30582 1726855373.77678: variable 'interface' from source: play vars 30582 1726855373.77759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855373.77984: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855373.77989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855373.78017: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855373.78051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855373.78101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855373.78128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855373.78157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.78190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855373.78256: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855373.78514: variable 'network_connections' from source: include params 30582 1726855373.78636: variable 'interface' from source: play 
vars 30582 1726855373.78639: variable 'interface' from source: play vars 30582 1726855373.78641: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855373.78643: when evaluation is False, skipping this task 30582 1726855373.78645: _execute() done 30582 1726855373.78647: dumping result to json 30582 1726855373.78649: done dumping result, returning 30582 1726855373.78660: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000021aa] 30582 1726855373.78670: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021aa 30582 1726855373.78993: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021aa 30582 1726855373.78996: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855373.79049: no more pending results, returning what we have 30582 1726855373.79055: results queue empty 30582 1726855373.79056: checking for any_errors_fatal 30582 1726855373.79063: done checking for any_errors_fatal 30582 1726855373.79064: checking for max_fail_percentage 30582 1726855373.79066: done checking for max_fail_percentage 30582 1726855373.79068: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.79068: done checking to see if all hosts have failed 30582 1726855373.79069: getting the remaining hosts for this loop 30582 1726855373.79071: done getting the remaining hosts for this loop 30582 1726855373.79074: getting the next task for host managed_node3 30582 1726855373.79083: done getting next task for host managed_node3 30582 1726855373.79089: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855373.79094: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855373.79117: getting variables 30582 1726855373.79119: in VariableManager get_vars() 30582 1726855373.79167: Calling all_inventory to load vars for managed_node3 30582 1726855373.79171: Calling groups_inventory to load vars for managed_node3 30582 1726855373.79173: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.79185: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.79377: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.79382: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.80720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.82270: done with get_vars() 30582 1726855373.82298: done getting variables 30582 1726855373.82359: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:53 -0400 (0:00:00.112) 0:01:50.173 ****** 30582 1726855373.82400: entering _queue_task() for managed_node3/package 30582 1726855373.82764: worker is 1 (out of 1 available) 30582 1726855373.82776: exiting _queue_task() for managed_node3/package 30582 1726855373.82992: done queuing things up, now waiting for results queue to drain 30582 1726855373.82994: waiting for pending results... 
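For orientation: the "Ask user's consent to restart NetworkManager" task above skipped because its `when:` expression, `__network_wireless_connections_defined or __network_team_connections_defined`, evaluated False against the play's `network_connections`. A minimal Python sketch of that gate, assuming the role derives each flag from the connection `type` fields (the helper name and the exact derivation are illustrative, not the role's actual Jinja2):

```python
# Hedged sketch of the gate behind the "skipping: [managed_node3]" result above.
# Assumption: __network_wireless_connections_defined and
# __network_team_connections_defined boil down to "is any connection of
# type wireless/team present in network_connections".
def restart_consent_needed(network_connections):
    """True when a NetworkManager restart would need the user's consent."""
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)
    return wireless_defined or team_defined
```

An ethernet-only connection list, as in this run, leaves both flags False, so the `fail` action task is skipped.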
30582 1726855373.83092: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855373.83237: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021ab 30582 1726855373.83258: variable 'ansible_search_path' from source: unknown 30582 1726855373.83267: variable 'ansible_search_path' from source: unknown 30582 1726855373.83309: calling self._execute() 30582 1726855373.83414: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855373.83424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855373.83443: variable 'omit' from source: magic vars 30582 1726855373.83830: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.83848: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855373.84049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855373.84331: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855373.84380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855373.84427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855373.84507: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855373.84642: variable 'network_packages' from source: role '' defaults 30582 1726855373.84742: variable '__network_provider_setup' from source: role '' defaults 30582 1726855373.84758: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855373.84910: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855373.84926: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855373.85005: variable 
'__network_packages_default_nm' from source: role '' defaults 30582 1726855373.85211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855373.87432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855373.87594: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855373.87598: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855373.87618: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855373.87648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855373.88181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.88218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.88247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.88301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.88371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855373.88375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.88404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.88431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.88477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.88502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.88892: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855373.88895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.88897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.88899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.88942: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.88960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.89071: variable 'ansible_python' from source: facts 30582 1726855373.89097: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855373.89194: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855373.89284: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855373.89430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.89470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.89502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.89544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.89574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.89628: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855373.89674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855373.89783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.89786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855373.89792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855373.89926: variable 'network_connections' from source: include params 30582 1726855373.89936: variable 'interface' from source: play vars 30582 1726855373.90042: variable 'interface' from source: play vars 30582 1726855373.90126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855373.90154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855373.90186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855373.90231: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855373.90278: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855373.90554: variable 'network_connections' from source: include params 30582 1726855373.90560: variable 'interface' from source: play vars 30582 1726855373.90666: variable 'interface' from source: play vars 30582 1726855373.90781: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855373.90807: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855373.91143: variable 'network_connections' from source: include params 30582 1726855373.91146: variable 'interface' from source: play vars 30582 1726855373.91219: variable 'interface' from source: play vars 30582 1726855373.91242: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855373.91395: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855373.91662: variable 'network_connections' from source: include params 30582 1726855373.91670: variable 'interface' from source: play vars 30582 1726855373.91732: variable 'interface' from source: play vars 30582 1726855373.91804: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855373.91869: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855373.91877: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855373.91937: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855373.92164: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855373.92810: variable 'network_connections' from source: include params 30582 1726855373.92813: variable 'interface' from 
source: play vars 30582 1726855373.92850: variable 'interface' from source: play vars 30582 1726855373.92861: variable 'ansible_distribution' from source: facts 30582 1726855373.92864: variable '__network_rh_distros' from source: role '' defaults 30582 1726855373.92870: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.92919: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855373.93099: variable 'ansible_distribution' from source: facts 30582 1726855373.93103: variable '__network_rh_distros' from source: role '' defaults 30582 1726855373.93192: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.93195: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855373.93294: variable 'ansible_distribution' from source: facts 30582 1726855373.93298: variable '__network_rh_distros' from source: role '' defaults 30582 1726855373.93303: variable 'ansible_distribution_major_version' from source: facts 30582 1726855373.93338: variable 'network_provider' from source: set_fact 30582 1726855373.93352: variable 'ansible_facts' from source: unknown 30582 1726855373.94099: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855373.94103: when evaluation is False, skipping this task 30582 1726855373.94105: _execute() done 30582 1726855373.94107: dumping result to json 30582 1726855373.94110: done dumping result, returning 30582 1726855373.94181: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-0000000021ab] 30582 1726855373.94184: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ab 30582 1726855373.94250: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ab 30582 1726855373.94253: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855373.94332: no more pending results, returning what we have 30582 1726855373.94335: results queue empty 30582 1726855373.94336: checking for any_errors_fatal 30582 1726855373.94344: done checking for any_errors_fatal 30582 1726855373.94345: checking for max_fail_percentage 30582 1726855373.94347: done checking for max_fail_percentage 30582 1726855373.94348: checking to see if all hosts have failed and the running result is not ok 30582 1726855373.94349: done checking to see if all hosts have failed 30582 1726855373.94349: getting the remaining hosts for this loop 30582 1726855373.94351: done getting the remaining hosts for this loop 30582 1726855373.94355: getting the next task for host managed_node3 30582 1726855373.94364: done getting next task for host managed_node3 30582 1726855373.94368: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855373.94374: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855373.94396: getting variables 30582 1726855373.94398: in VariableManager get_vars() 30582 1726855373.94445: Calling all_inventory to load vars for managed_node3 30582 1726855373.94448: Calling groups_inventory to load vars for managed_node3 30582 1726855373.94450: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855373.94459: Calling all_plugins_play to load vars for managed_node3 30582 1726855373.94462: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855373.94465: Calling groups_plugins_play to load vars for managed_node3 30582 1726855373.96330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855373.98018: done with get_vars() 30582 1726855373.98041: done getting variables 30582 1726855373.98109: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:53 -0400 (0:00:00.157) 0:01:50.331 ****** 30582 1726855373.98144: entering _queue_task() for managed_node3/package 30582 1726855373.99072: worker is 1 (out of 1 available) 30582 1726855373.99085: exiting _queue_task() for managed_node3/package 30582 1726855373.99200: done queuing things up, now waiting for results queue to drain 30582 
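The "Install packages" skip above hinges on the Jinja2 test `not network_packages is subset(ansible_facts.packages.keys())`: the `package` action only runs when at least one required package is missing from the gathered package facts. A rough Python equivalent of that subset test (the variable shapes are assumed from the log, not taken from the role source):

```python
# Approximation of: not network_packages is subset(ansible_facts.packages.keys())
# Assumption: ansible_facts.packages (as produced by the package_facts
# module) maps package name -> list of installed-version dicts, so the
# Jinja2 `subset` test corresponds to set.issubset on the key view.
def install_needed(network_packages, package_facts):
    """True when some required package is absent, so `package:` must run."""
    return not set(network_packages).issubset(package_facts.keys())
```

In this run every required package was already present, so the conditional came back False and the task was skipped without touching the package manager.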
1726855373.99202: waiting for pending results... 30582 1726855373.99806: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855373.99877: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021ac 30582 1726855373.99901: variable 'ansible_search_path' from source: unknown 30582 1726855373.99912: variable 'ansible_search_path' from source: unknown 30582 1726855374.00182: calling self._execute() 30582 1726855374.00278: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.00295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.00310: variable 'omit' from source: magic vars 30582 1726855374.00688: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.00706: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855374.00827: variable 'network_state' from source: role '' defaults 30582 1726855374.00842: Evaluated conditional (network_state != {}): False 30582 1726855374.00849: when evaluation is False, skipping this task 30582 1726855374.00856: _execute() done 30582 1726855374.00862: dumping result to json 30582 1726855374.00870: done dumping result, returning 30582 1726855374.00881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000021ac] 30582 1726855374.00895: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ac skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855374.01048: no more pending results, returning what we have 30582 1726855374.01053: results queue empty 30582 1726855374.01054: checking for any_errors_fatal 30582 1726855374.01061: done checking for any_errors_fatal 30582 
1726855374.01061: checking for max_fail_percentage 30582 1726855374.01066: done checking for max_fail_percentage 30582 1726855374.01066: checking to see if all hosts have failed and the running result is not ok 30582 1726855374.01067: done checking to see if all hosts have failed 30582 1726855374.01068: getting the remaining hosts for this loop 30582 1726855374.01069: done getting the remaining hosts for this loop 30582 1726855374.01073: getting the next task for host managed_node3 30582 1726855374.01082: done getting next task for host managed_node3 30582 1726855374.01091: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855374.01096: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855374.01125: getting variables 30582 1726855374.01127: in VariableManager get_vars() 30582 1726855374.01176: Calling all_inventory to load vars for managed_node3 30582 1726855374.01179: Calling groups_inventory to load vars for managed_node3 30582 1726855374.01181: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855374.01313: Calling all_plugins_play to load vars for managed_node3 30582 1726855374.01317: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855374.01321: Calling groups_plugins_play to load vars for managed_node3 30582 1726855374.01843: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ac 30582 1726855374.01847: WORKER PROCESS EXITING 30582 1726855374.03251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855374.07017: done with get_vars() 30582 1726855374.07052: done getting variables 30582 1726855374.07319: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:54 -0400 (0:00:00.092) 0:01:50.423 ****** 30582 1726855374.07357: entering _queue_task() for managed_node3/package 30582 1726855374.08131: worker is 1 (out of 1 available) 30582 1726855374.08143: exiting _queue_task() for managed_node3/package 30582 1726855374.08154: done queuing things up, now waiting for results queue to drain 30582 1726855374.08155: waiting for pending results... 
30582 1726855374.08510: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855374.08772: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021ad 30582 1726855374.08789: variable 'ansible_search_path' from source: unknown 30582 1726855374.08793: variable 'ansible_search_path' from source: unknown 30582 1726855374.08828: calling self._execute() 30582 1726855374.09142: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.09146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.09149: variable 'omit' from source: magic vars 30582 1726855374.10392: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.10397: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855374.10400: variable 'network_state' from source: role '' defaults 30582 1726855374.10403: Evaluated conditional (network_state != {}): False 30582 1726855374.10405: when evaluation is False, skipping this task 30582 1726855374.10408: _execute() done 30582 1726855374.10410: dumping result to json 30582 1726855374.10412: done dumping result, returning 30582 1726855374.10414: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000021ad] 30582 1726855374.10416: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ad 30582 1726855374.10493: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ad 30582 1726855374.10496: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855374.10541: no more pending results, returning what we have 30582 1726855374.10545: results queue empty 30582 1726855374.10546: checking for 
any_errors_fatal 30582 1726855374.10554: done checking for any_errors_fatal 30582 1726855374.10554: checking for max_fail_percentage 30582 1726855374.10556: done checking for max_fail_percentage 30582 1726855374.10557: checking to see if all hosts have failed and the running result is not ok 30582 1726855374.10558: done checking to see if all hosts have failed 30582 1726855374.10559: getting the remaining hosts for this loop 30582 1726855374.10560: done getting the remaining hosts for this loop 30582 1726855374.10564: getting the next task for host managed_node3 30582 1726855374.10574: done getting next task for host managed_node3 30582 1726855374.10578: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855374.10585: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855374.10613: getting variables 30582 1726855374.10615: in VariableManager get_vars() 30582 1726855374.10666: Calling all_inventory to load vars for managed_node3 30582 1726855374.10669: Calling groups_inventory to load vars for managed_node3 30582 1726855374.10671: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855374.10684: Calling all_plugins_play to load vars for managed_node3 30582 1726855374.11022: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855374.11029: Calling groups_plugins_play to load vars for managed_node3 30582 1726855374.13833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855374.17358: done with get_vars() 30582 1726855374.17501: done getting variables 30582 1726855374.17570: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:54 -0400 (0:00:00.102) 0:01:50.526 ****** 30582 1726855374.17612: entering _queue_task() for managed_node3/service 30582 1726855374.18531: worker is 1 (out of 1 available) 30582 1726855374.18544: exiting _queue_task() for managed_node3/service 30582 1726855374.18556: done queuing things up, now waiting for results queue to drain 30582 1726855374.18557: waiting for pending results... 
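The next task, "Restart NetworkManager due to wireless or team interfaces", is skipped on a compound guard. A sketch of that evaluation (plain Python in place of the Jinja2 `when` clause; both flags are assumed False here because the play's `network_connections` defines neither a wireless nor a team interface):

```python
# Assumed values: the role derives these flags from network_connections; with
# no wireless or team connection present, both come out False.
__network_wireless_connections_defined = False
__network_team_connections_defined = False

# The `or` of two False flags is False, so the task is skipped and the whole
# expression is reported as `false_condition`, exactly as the log shows.
condition = "__network_wireless_connections_defined or __network_team_connections_defined"
passed = __network_wireless_connections_defined or __network_team_connections_defined
result = {
    "changed": False,
    "false_condition": None if passed else condition,
    "skip_reason": None if passed else "Conditional result was False",
}
```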
30582 1726855374.19094: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855374.19397: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021ae 30582 1726855374.19402: variable 'ansible_search_path' from source: unknown 30582 1726855374.19405: variable 'ansible_search_path' from source: unknown 30582 1726855374.19493: calling self._execute() 30582 1726855374.19738: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.19742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.19746: variable 'omit' from source: magic vars 30582 1726855374.20466: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.20510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855374.20928: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855374.21395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855374.24098: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855374.24176: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855374.24225: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855374.24280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855374.24315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855374.24365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855374.24480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.24483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.24486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.24518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.24573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.24615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.24645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.24690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.24717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.24762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.24826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.24829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.24866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.24884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.25084: variable 'network_connections' from source: include params 30582 1726855374.25106: variable 'interface' from source: play vars 30582 1726855374.25182: variable 'interface' from source: play vars 30582 1726855374.25253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855374.25381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855374.25427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855374.25449: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855374.25502: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855374.25568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855374.25571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855374.25574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.25598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855374.25683: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855374.25906: variable 'network_connections' from source: include params 30582 1726855374.25917: variable 'interface' from source: play vars 30582 1726855374.25975: variable 'interface' from source: play vars 30582 1726855374.26015: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855374.26023: when evaluation is False, skipping this task 30582 1726855374.26025: _execute() done 30582 1726855374.26028: dumping result to json 30582 1726855374.26030: done dumping result, returning 30582 1726855374.26036: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000021ae] 30582 1726855374.26042: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021ae 30582 1726855374.26173: done sending task result for task 
0affcc66-ac2b-aa83-7d57-0000000021ae 30582 1726855374.26341: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855374.26401: no more pending results, returning what we have 30582 1726855374.26405: results queue empty 30582 1726855374.26406: checking for any_errors_fatal 30582 1726855374.26413: done checking for any_errors_fatal 30582 1726855374.26414: checking for max_fail_percentage 30582 1726855374.26416: done checking for max_fail_percentage 30582 1726855374.26417: checking to see if all hosts have failed and the running result is not ok 30582 1726855374.26418: done checking to see if all hosts have failed 30582 1726855374.26418: getting the remaining hosts for this loop 30582 1726855374.26420: done getting the remaining hosts for this loop 30582 1726855374.26424: getting the next task for host managed_node3 30582 1726855374.26432: done getting next task for host managed_node3 30582 1726855374.26436: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855374.26442: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855374.26462: getting variables 30582 1726855374.26464: in VariableManager get_vars() 30582 1726855374.26511: Calling all_inventory to load vars for managed_node3 30582 1726855374.26514: Calling groups_inventory to load vars for managed_node3 30582 1726855374.26517: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855374.26528: Calling all_plugins_play to load vars for managed_node3 30582 1726855374.26531: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855374.26534: Calling groups_plugins_play to load vars for managed_node3 30582 1726855374.27708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855374.29353: done with get_vars() 30582 1726855374.29379: done getting variables 30582 1726855374.29440: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:54 -0400 (0:00:00.118) 0:01:50.644 ****** 30582 1726855374.29476: entering _queue_task() for managed_node3/service 30582 1726855374.29825: worker is 1 (out of 1 available) 30582 1726855374.29839: exiting _queue_task() for managed_node3/service 30582 1726855374.29852: done 
queuing things up, now waiting for results queue to drain 30582 1726855374.29854: waiting for pending results... 30582 1726855374.30298: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855374.30304: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021af 30582 1726855374.30307: variable 'ansible_search_path' from source: unknown 30582 1726855374.30312: variable 'ansible_search_path' from source: unknown 30582 1726855374.30351: calling self._execute() 30582 1726855374.30458: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.30461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.30500: variable 'omit' from source: magic vars 30582 1726855374.30850: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.30868: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855374.31259: variable 'network_provider' from source: set_fact 30582 1726855374.31266: variable 'network_state' from source: role '' defaults 30582 1726855374.31269: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855374.31272: variable 'omit' from source: magic vars 30582 1726855374.31274: variable 'omit' from source: magic vars 30582 1726855374.31276: variable 'network_service_name' from source: role '' defaults 30582 1726855374.31278: variable 'network_service_name' from source: role '' defaults 30582 1726855374.31323: variable '__network_provider_setup' from source: role '' defaults 30582 1726855374.31329: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855374.31390: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855374.31590: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855374.31593: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855374.31694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855374.34189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855374.34270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855374.34305: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855374.34338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855374.34370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855374.34448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.34484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.34514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.34555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.34572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.34758: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.34761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.34766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.34769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.34772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.34935: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855374.35196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.35199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.35201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.35204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.35206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.35237: variable 'ansible_python' from source: facts 30582 1726855374.35255: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855374.35339: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855374.35416: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855374.35538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.35570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.35591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.35631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.35642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.35693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855374.35715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855374.35737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.35779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855374.35795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855374.35924: variable 'network_connections' from source: include params 30582 1726855374.35932: variable 'interface' from source: play vars 30582 1726855374.36009: variable 'interface' from source: play vars 30582 1726855374.36115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855374.36312: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855374.36358: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855374.36402: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855374.36444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855374.36503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855374.36535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855374.36568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855374.36610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855374.36721: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855374.36932: variable 'network_connections' from source: include params 30582 1726855374.36936: variable 'interface' from source: play vars 30582 1726855374.37092: variable 'interface' from source: play vars 30582 1726855374.37095: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855374.37136: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855374.37505: variable 'network_connections' from source: include params 30582 1726855374.37508: variable 'interface' from source: play vars 30582 1726855374.37511: variable 'interface' from source: play vars 30582 1726855374.37533: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855374.37613: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855374.37903: variable 'network_connections' from source: include params 30582 1726855374.37906: variable 'interface' from source: play vars 30582 1726855374.37977: variable 'interface' from source: play vars 30582 1726855374.38151: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855374.38154: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855374.38157: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855374.38273: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855374.38381: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855374.38882: variable 'network_connections' from source: include params 30582 1726855374.38885: variable 'interface' from source: play vars 30582 1726855374.38949: variable 'interface' from source: play vars 30582 1726855374.38959: variable 'ansible_distribution' from source: facts 30582 1726855374.38962: variable '__network_rh_distros' from source: role '' defaults 30582 1726855374.38968: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.39002: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855374.39296: variable 'ansible_distribution' from source: facts 30582 1726855374.39300: variable '__network_rh_distros' from source: role '' defaults 30582 1726855374.39302: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.39304: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855374.39370: variable 'ansible_distribution' from source: facts 30582 1726855374.39375: variable '__network_rh_distros' from source: role '' defaults 30582 1726855374.39382: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.39420: variable 'network_provider' from source: set_fact 30582 1726855374.39444: variable 'omit' from source: magic vars 30582 1726855374.39478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855374.39514: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855374.39526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855374.39545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855374.39557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855374.39592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855374.39595: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.39598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.39792: Set connection var ansible_timeout to 10 30582 1726855374.39795: Set connection var ansible_connection to ssh 30582 1726855374.39797: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855374.39799: Set connection var ansible_pipelining to False 30582 1726855374.39801: Set connection var ansible_shell_executable to /bin/sh 30582 1726855374.39803: Set connection var ansible_shell_type to sh 30582 1726855374.39805: variable 'ansible_shell_executable' from source: unknown 30582 1726855374.39807: variable 'ansible_connection' from source: unknown 30582 1726855374.39809: variable 'ansible_module_compression' from source: unknown 30582 1726855374.39811: variable 'ansible_shell_type' from source: unknown 30582 1726855374.39813: variable 'ansible_shell_executable' from source: unknown 30582 1726855374.39814: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.39816: variable 'ansible_pipelining' from source: unknown 30582 1726855374.39818: variable 'ansible_timeout' from source: unknown 30582 1726855374.39820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855374.39874: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855374.39885: variable 'omit' from source: magic vars 30582 1726855374.39899: starting attempt loop 30582 1726855374.39902: running the handler 30582 1726855374.39975: variable 'ansible_facts' from source: unknown 30582 1726855374.40821: _low_level_execute_command(): starting 30582 1726855374.40824: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855374.41437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855374.41448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855374.41461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855374.41476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855374.41491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855374.41499: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855374.41528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855374.41628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855374.41633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855374.41635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.41729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855374.43540: stdout chunk (state=3): >>>/root <<< 30582 1726855374.43811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855374.43815: stdout chunk (state=3): >>><<< 30582 1726855374.43818: stderr chunk (state=3): >>><<< 30582 1726855374.43904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855374.43909: 
_low_level_execute_command(): starting 30582 1726855374.43915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559 `" && echo ansible-tmp-1726855374.4384336-35730-158386270978559="` echo /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559 `" ) && sleep 0' 30582 1726855374.45299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855374.45303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.45308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855374.45310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855374.45312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.45358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855374.45623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.45675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 
1726855374.47641: stdout chunk (state=3): >>>ansible-tmp-1726855374.4384336-35730-158386270978559=/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559 <<< 30582 1726855374.47831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855374.48029: stderr chunk (state=3): >>><<< 30582 1726855374.48033: stdout chunk (state=3): >>><<< 30582 1726855374.48035: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855374.4384336-35730-158386270978559=/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855374.48037: variable 'ansible_module_compression' from source: unknown 30582 1726855374.48047: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 
1726855374.48205: variable 'ansible_facts' from source: unknown 30582 1726855374.48614: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py 30582 1726855374.49129: Sending initial data 30582 1726855374.49133: Sent initial data (156 bytes) 30582 1726855374.50124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855374.50205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.50373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855374.50391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.50483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855374.52122: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30582 1726855374.52127: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30582 1726855374.52135: stderr chunk (state=3): >>>debug2: Server supports 
extension "statvfs@openssh.com" revision 2 <<< 30582 1726855374.52142: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 30582 1726855374.52155: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855374.52243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855374.52306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpq93si2mi /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py <<< 30582 1726855374.52322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py" <<< 30582 1726855374.52365: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpq93si2mi" to remote "/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py" <<< 30582 1726855374.54440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855374.54472: stderr chunk (state=3): >>><<< 30582 1726855374.54476: stdout chunk (state=3): >>><<< 30582 1726855374.54488: done transferring module to remote 30582 1726855374.54498: _low_level_execute_command(): starting 
30582 1726855374.54503: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/ /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py && sleep 0' 30582 1726855374.54935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855374.54939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.54941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855374.54943: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855374.54945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.54989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855374.54992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.55065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855374.56940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855374.56944: stdout chunk (state=3): >>><<< 30582 
1726855374.56946: stderr chunk (state=3): >>><<< 30582 1726855374.56970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855374.57048: _low_level_execute_command(): starting 30582 1726855374.57053: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/AnsiballZ_systemd.py && sleep 0' 30582 1726855374.57614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855374.57617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855374.57619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855374.57621: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.57636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855374.57642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855374.57648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.57752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.57827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855374.86945: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10620928", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320078336", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2285424000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855374.86976: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", 
"RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.s<<< 30582 1726855374.86991: stdout chunk (state=3): >>>ocket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855374.88985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855374.88991: stdout chunk (state=3): >>><<< 30582 1726855374.88993: stderr chunk (state=3): >>><<< 30582 1726855374.88998: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10620928", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320078336", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2285424000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855374.89169: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855374.89198: _low_level_execute_command(): starting 30582 1726855374.89208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855374.4384336-35730-158386270978559/ > /dev/null 2>&1 && sleep 0' 30582 1726855374.89889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855374.89995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855374.90015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855374.90029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855374.90052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855374.90073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855374.90171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855374.92047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855374.92058: stdout chunk (state=3): >>><<< 30582 1726855374.92074: stderr chunk (state=3): >>><<< 30582 1726855374.92098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 30582 1726855374.92119: handler run complete 30582 1726855374.92190: attempt loop complete, returning result 30582 1726855374.92199: _execute() done 30582 1726855374.92292: dumping result to json 30582 1726855374.92295: done dumping result, returning 30582 1726855374.92297: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-0000000021af] 30582 1726855374.92299: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021af 30582 1726855374.93344: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021af 30582 1726855374.93348: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855374.93441: no more pending results, returning what we have 30582 1726855374.93444: results queue empty 30582 1726855374.93446: checking for any_errors_fatal 30582 1726855374.93449: done checking for any_errors_fatal 30582 1726855374.93450: checking for max_fail_percentage 30582 1726855374.93452: done checking for max_fail_percentage 30582 1726855374.93453: checking to see if all hosts have failed and the running result is not ok 30582 1726855374.93454: done checking to see if all hosts have failed 30582 1726855374.93455: getting the remaining hosts for this loop 30582 1726855374.93456: done getting the remaining hosts for this loop 30582 1726855374.93459: getting the next task for host managed_node3 30582 1726855374.93468: done getting next task for host managed_node3 30582 1726855374.93472: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855374.93481: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855374.93495: getting variables 30582 1726855374.93497: in VariableManager get_vars() 30582 1726855374.93528: Calling all_inventory to load vars for managed_node3 30582 1726855374.93530: Calling groups_inventory to load vars for managed_node3 30582 1726855374.93533: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855374.93549: Calling all_plugins_play to load vars for managed_node3 30582 1726855374.93552: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855374.93556: Calling groups_plugins_play to load vars for managed_node3 30582 1726855374.94885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855374.96617: done with get_vars() 30582 1726855374.96643: done getting variables 30582 1726855374.96719: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:54 -0400 (0:00:00.672) 0:01:51.317 ****** 30582 1726855374.96762: entering _queue_task() for managed_node3/service 30582 1726855374.97416: worker is 1 (out of 1 available) 30582 1726855374.97425: exiting _queue_task() for managed_node3/service 30582 1726855374.97436: done queuing things up, now waiting for results queue to drain 30582 1726855374.97438: waiting for pending results... 30582 1726855374.97577: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855374.97709: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b0 30582 1726855374.97732: variable 'ansible_search_path' from source: unknown 30582 1726855374.97742: variable 'ansible_search_path' from source: unknown 30582 1726855374.97797: calling self._execute() 30582 1726855374.97997: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855374.98000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855374.98003: variable 'omit' from source: magic vars 30582 1726855374.98342: variable 'ansible_distribution_major_version' from source: facts 30582 1726855374.98359: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855374.98490: variable 'network_provider' from source: set_fact 30582 1726855374.98503: Evaluated conditional (network_provider == "nm"): True 30582 1726855374.98608: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855374.98711: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30582 1726855374.98906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855375.00685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855375.00734: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855375.00761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855375.00789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855375.00809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855375.01021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855375.01043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855375.01062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855375.01091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855375.01102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855375.01136: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855375.01155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855375.01173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855375.01199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855375.01209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855375.01237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855375.01258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855375.01273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855375.01299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855375.01309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855375.01423: variable 'network_connections' from source: include params 30582 1726855375.01442: variable 'interface' from source: play vars 30582 1726855375.01501: variable 'interface' from source: play vars 30582 1726855375.01570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855375.01708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855375.01893: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855375.01896: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855375.01898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855375.01901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855375.01903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855375.01905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855375.01907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855375.01939: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30582 1726855375.02176: variable 'network_connections' from source: include params 30582 1726855375.02180: variable 'interface' from source: play vars 30582 1726855375.02247: variable 'interface' from source: play vars 30582 1726855375.02286: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855375.02291: when evaluation is False, skipping this task 30582 1726855375.02294: _execute() done 30582 1726855375.02297: dumping result to json 30582 1726855375.02299: done dumping result, returning 30582 1726855375.02309: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-0000000021b0] 30582 1726855375.02320: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b0 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855375.02493: no more pending results, returning what we have 30582 1726855375.02497: results queue empty 30582 1726855375.02498: checking for any_errors_fatal 30582 1726855375.02527: done checking for any_errors_fatal 30582 1726855375.02527: checking for max_fail_percentage 30582 1726855375.02530: done checking for max_fail_percentage 30582 1726855375.02531: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.02532: done checking to see if all hosts have failed 30582 1726855375.02532: getting the remaining hosts for this loop 30582 1726855375.02534: done getting the remaining hosts for this loop 30582 1726855375.02538: getting the next task for host managed_node3 30582 1726855375.02546: done getting next task for host managed_node3 30582 1726855375.02550: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855375.02555: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855375.02577: getting variables 30582 1726855375.02579: in VariableManager get_vars() 30582 1726855375.02650: Calling all_inventory to load vars for managed_node3 30582 1726855375.02654: Calling groups_inventory to load vars for managed_node3 30582 1726855375.02656: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.02685: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b0 30582 1726855375.02690: WORKER PROCESS EXITING 30582 1726855375.02699: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.02708: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.02712: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.04016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.04912: done with get_vars() 30582 1726855375.04930: done getting variables 30582 1726855375.04976: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:55 -0400 (0:00:00.082) 0:01:51.399 ****** 30582 1726855375.05005: entering _queue_task() for managed_node3/service 30582 1726855375.05518: worker is 1 (out of 1 available) 30582 1726855375.05530: exiting _queue_task() for managed_node3/service 30582 1726855375.05541: done queuing things up, now waiting for results queue to drain 30582 1726855375.05543: waiting for pending results... 
30582 1726855375.05704: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855375.05911: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b1 30582 1726855375.05915: variable 'ansible_search_path' from source: unknown 30582 1726855375.05918: variable 'ansible_search_path' from source: unknown 30582 1726855375.05921: calling self._execute() 30582 1726855375.05983: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.06201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.06212: variable 'omit' from source: magic vars 30582 1726855375.06751: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.06772: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.06895: variable 'network_provider' from source: set_fact 30582 1726855375.06899: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855375.06902: when evaluation is False, skipping this task 30582 1726855375.06904: _execute() done 30582 1726855375.06907: dumping result to json 30582 1726855375.06910: done dumping result, returning 30582 1726855375.06924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-0000000021b1] 30582 1726855375.06926: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b1 30582 1726855375.07039: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b1 30582 1726855375.07043: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855375.07110: no more pending results, returning what we have 30582 1726855375.07115: results queue empty 30582 1726855375.07115: checking for any_errors_fatal 30582 1726855375.07124: done checking for 
any_errors_fatal 30582 1726855375.07125: checking for max_fail_percentage 30582 1726855375.07127: done checking for max_fail_percentage 30582 1726855375.07128: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.07129: done checking to see if all hosts have failed 30582 1726855375.07129: getting the remaining hosts for this loop 30582 1726855375.07131: done getting the remaining hosts for this loop 30582 1726855375.07135: getting the next task for host managed_node3 30582 1726855375.07143: done getting next task for host managed_node3 30582 1726855375.07147: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855375.07154: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855375.07181: getting variables 30582 1726855375.07183: in VariableManager get_vars() 30582 1726855375.07230: Calling all_inventory to load vars for managed_node3 30582 1726855375.07233: Calling groups_inventory to load vars for managed_node3 30582 1726855375.07235: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.07245: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.07248: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.07250: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.08091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.09369: done with get_vars() 30582 1726855375.09398: done getting variables 30582 1726855375.09461: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:55 -0400 (0:00:00.044) 0:01:51.444 ****** 30582 1726855375.09505: entering _queue_task() for managed_node3/copy 30582 1726855375.10116: worker is 1 (out of 1 available) 30582 1726855375.10126: exiting _queue_task() for managed_node3/copy 30582 1726855375.10136: done queuing things up, now waiting for results queue to drain 30582 1726855375.10137: waiting for pending results... 
30582 1726855375.10375: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855375.10382: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b2 30582 1726855375.10385: variable 'ansible_search_path' from source: unknown 30582 1726855375.10390: variable 'ansible_search_path' from source: unknown 30582 1726855375.10417: calling self._execute() 30582 1726855375.10579: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.10583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.10589: variable 'omit' from source: magic vars 30582 1726855375.10922: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.10935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.11056: variable 'network_provider' from source: set_fact 30582 1726855375.11062: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855375.11065: when evaluation is False, skipping this task 30582 1726855375.11071: _execute() done 30582 1726855375.11074: dumping result to json 30582 1726855375.11076: done dumping result, returning 30582 1726855375.11122: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-0000000021b2] 30582 1726855375.11125: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b2 30582 1726855375.11194: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b2 30582 1726855375.11198: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855375.11250: no more pending results, returning what we have 30582 1726855375.11254: results queue empty 30582 1726855375.11256: checking for 
any_errors_fatal 30582 1726855375.11266: done checking for any_errors_fatal 30582 1726855375.11267: checking for max_fail_percentage 30582 1726855375.11269: done checking for max_fail_percentage 30582 1726855375.11270: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.11271: done checking to see if all hosts have failed 30582 1726855375.11272: getting the remaining hosts for this loop 30582 1726855375.11273: done getting the remaining hosts for this loop 30582 1726855375.11277: getting the next task for host managed_node3 30582 1726855375.11286: done getting next task for host managed_node3 30582 1726855375.11291: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855375.11297: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855375.11323: getting variables 30582 1726855375.11325: in VariableManager get_vars() 30582 1726855375.11379: Calling all_inventory to load vars for managed_node3 30582 1726855375.11382: Calling groups_inventory to load vars for managed_node3 30582 1726855375.11385: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.11597: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.11601: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.11605: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.13152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.14674: done with get_vars() 30582 1726855375.14702: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:55 -0400 (0:00:00.052) 0:01:51.497 ****** 30582 1726855375.14791: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855375.15196: worker is 1 (out of 1 available) 30582 1726855375.15210: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855375.15223: done queuing things up, now waiting for results queue to drain 30582 1726855375.15225: waiting for pending results... 
30582 1726855375.15643: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855375.15739: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b3 30582 1726855375.15744: variable 'ansible_search_path' from source: unknown 30582 1726855375.15747: variable 'ansible_search_path' from source: unknown 30582 1726855375.15848: calling self._execute() 30582 1726855375.15901: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.15905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.15910: variable 'omit' from source: magic vars 30582 1726855375.16296: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.16307: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.16314: variable 'omit' from source: magic vars 30582 1726855375.16392: variable 'omit' from source: magic vars 30582 1726855375.16554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855375.18592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855375.18640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855375.18670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855375.18698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855375.18721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855375.18784: variable 'network_provider' from source: set_fact 30582 1726855375.18888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855375.18908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855375.18928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855375.18953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855375.18963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855375.19021: variable 'omit' from source: magic vars 30582 1726855375.19101: variable 'omit' from source: magic vars 30582 1726855375.19173: variable 'network_connections' from source: include params 30582 1726855375.19183: variable 'interface' from source: play vars 30582 1726855375.19228: variable 'interface' from source: play vars 30582 1726855375.19339: variable 'omit' from source: magic vars 30582 1726855375.19346: variable '__lsr_ansible_managed' from source: task vars 30582 1726855375.19393: variable '__lsr_ansible_managed' from source: task vars 30582 1726855375.19530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855375.19670: Loaded config def from plugin (lookup/template) 30582 1726855375.19674: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855375.19699: File lookup term: get_ansible_managed.j2 30582 1726855375.19702: variable 
'ansible_search_path' from source: unknown 30582 1726855375.19705: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855375.19716: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855375.19731: variable 'ansible_search_path' from source: unknown 30582 1726855375.34796: variable 'ansible_managed' from source: unknown 30582 1726855375.34892: variable 'omit' from source: magic vars 30582 1726855375.34925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855375.34953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855375.34975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855375.34996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855375.35008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855375.35030: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855375.35037: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.35092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.35148: Set connection var ansible_timeout to 10 30582 1726855375.35157: Set connection var ansible_connection to ssh 30582 1726855375.35176: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855375.35189: Set connection var ansible_pipelining to False 30582 1726855375.35201: Set connection var ansible_shell_executable to /bin/sh 30582 1726855375.35208: Set connection var ansible_shell_type to sh 30582 1726855375.35236: variable 'ansible_shell_executable' from source: unknown 30582 1726855375.35244: variable 'ansible_connection' from source: unknown 30582 1726855375.35292: variable 'ansible_module_compression' from source: unknown 30582 1726855375.35295: variable 'ansible_shell_type' from source: unknown 30582 1726855375.35297: variable 'ansible_shell_executable' from source: unknown 30582 1726855375.35299: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.35302: variable 'ansible_pipelining' from source: unknown 30582 1726855375.35304: variable 'ansible_timeout' from source: unknown 30582 1726855375.35306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.35420: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855375.35445: variable 'omit' from 
source: magic vars 30582 1726855375.35458: starting attempt loop 30582 1726855375.35492: running the handler 30582 1726855375.35495: _low_level_execute_command(): starting 30582 1726855375.35501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855375.36195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855375.36225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.36241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.36274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.36483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855375.36539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.36602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855375.38291: stdout chunk (state=3): >>>/root <<< 30582 1726855375.38450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855375.38454: stdout chunk (state=3): >>><<< 30582 
1726855375.38456: stderr chunk (state=3): >>><<< 30582 1726855375.38578: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855375.38582: _low_level_execute_command(): starting 30582 1726855375.38584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675 `" && echo ansible-tmp-1726855375.3848379-35763-156027635076675="` echo /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675 `" ) && sleep 0' 30582 1726855375.39164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855375.39181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.39199: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.39244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.39263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.39310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.39379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855375.39414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855375.39454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.39516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855375.41683: stdout chunk (state=3): >>>ansible-tmp-1726855375.3848379-35763-156027635076675=/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675 <<< 30582 1726855375.41814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855375.41854: stderr chunk (state=3): >>><<< 30582 1726855375.41873: stdout chunk (state=3): >>><<< 30582 1726855375.42105: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855375.3848379-35763-156027635076675=/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855375.42112: variable 'ansible_module_compression' from source: unknown 30582 1726855375.42115: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855375.42117: variable 'ansible_facts' from source: unknown 30582 1726855375.42177: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py 30582 1726855375.42397: Sending initial data 30582 1726855375.42599: Sent initial data (168 bytes) 30582 1726855375.42877: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855375.42894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.42910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.42934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855375.42950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855375.42961: stderr chunk (state=3): >>>debug2: match not found <<< 30582 1726855375.42974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.42994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855375.43005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855375.43015: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30582 1726855375.43026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.43038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.43054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855375.43108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.43145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855375.43163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855375.43185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.43274: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30582 1726855375.44970: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855375.45053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855375.45117: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpnrngu83s /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py <<< 30582 1726855375.45134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py" <<< 30582 1726855375.45264: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpnrngu83s" to remote "/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py" <<< 30582 1726855375.46966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855375.46990: stderr chunk (state=3): >>><<< 30582 
1726855375.47007: stdout chunk (state=3): >>><<< 30582 1726855375.47114: done transferring module to remote 30582 1726855375.47117: _low_level_execute_command(): starting 30582 1726855375.47120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/ /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py && sleep 0' 30582 1726855375.47732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855375.47747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.47780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.47801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855375.47912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855375.47958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.48018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855375.49995: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855375.49999: stdout chunk (state=3): >>><<< 30582 1726855375.50002: stderr chunk (state=3): >>><<< 30582 1726855375.50004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855375.50007: _low_level_execute_command(): starting 30582 1726855375.50009: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/AnsiballZ_network_connections.py && sleep 0' 30582 1726855375.50628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855375.50640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855375.50650: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30582 1726855375.50703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.50766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855375.50800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.50903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855375.78719: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 30582 1726855375.78838: stdout chunk (state=3): >>> <<< 30582 1726855375.82111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855375.82142: stderr chunk (state=3): >>><<< 30582 1726855375.82145: stdout chunk (state=3): >>><<< 30582 1726855375.82163: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855375.82198: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855375.82205: _low_level_execute_command(): starting 30582 1726855375.82210: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855375.3848379-35763-156027635076675/ > /dev/null 2>&1 && sleep 0' 30582 1726855375.82675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.82678: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.82680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855375.82683: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855375.82685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855375.82730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855375.82733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855375.82735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855375.82809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855375.84734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855375.84755: stderr chunk (state=3): >>><<< 30582 1726855375.84758: stdout chunk (state=3): >>><<< 30582 1726855375.84773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855375.84779: handler run complete 30582 1726855375.84803: attempt loop complete, returning result 30582 1726855375.84811: _execute() done 30582 1726855375.84813: dumping result to json 30582 1726855375.84816: done dumping result, returning 30582 1726855375.84825: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-0000000021b3] 30582 1726855375.84827: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b3 30582 1726855375.84935: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b3 30582 1726855375.84939: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "ip": {
                        "auto6": false,
                        "dhcp4": false
                    },
                    "name": "statebr",
                    "persistent_state": "present",
                    "type": "bridge"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[002] #0, state:None persistent_state:present, 'statebr': add connection statebr,
02f79b0a-2569-4459-9e63-b8baa27c9d76 30582 1726855375.85036: no more pending results, returning what we have 30582 1726855375.85040: results queue empty 30582 1726855375.85041: checking for any_errors_fatal 30582 1726855375.85047: done checking for any_errors_fatal 30582 1726855375.85048: checking for max_fail_percentage 30582 1726855375.85050: done checking for max_fail_percentage 30582 1726855375.85051: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.85052: done checking to see if all hosts have failed 30582 1726855375.85053: getting the remaining hosts for this loop 30582 1726855375.85054: done getting the remaining hosts for this loop 30582 1726855375.85058: getting the next task for host managed_node3 30582 1726855375.85066: done getting next task for host managed_node3 30582 1726855375.85070: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855375.85075: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855375.85086: getting variables 30582 1726855375.85095: in VariableManager get_vars() 30582 1726855375.85134: Calling all_inventory to load vars for managed_node3 30582 1726855375.85136: Calling groups_inventory to load vars for managed_node3 30582 1726855375.85138: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.85147: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.85149: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.85152: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.86027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.87053: done with get_vars() 30582 1726855375.87074: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:55 -0400 (0:00:00.723) 0:01:52.221 ****** 30582 1726855375.87140: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855375.87424: worker is 1 (out of 1 available) 30582 1726855375.87438: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855375.87452: done queuing things up, now waiting for results queue to drain 30582 1726855375.87454: waiting for pending results... 
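The trace above shows the full round trip for the `network_connections` module: the wrapped module is transferred, made executable with `chmod u+x`, run with the remote Python, its JSON result is read back from stdout, and the temporary directory is removed with `rm -f -r`. A minimal sketch of the result-parsing step, assuming only the stdout format visible in the log (the helper name is hypothetical, not Ansible's own API):

```python
import json

def parse_module_stdout(stdout: str) -> dict:
    # A module reports its result as one JSON object on stdout; find the
    # first brace and decode everything from there, skipping any leading noise.
    start = stdout.find("{")
    if start == -1:
        raise ValueError("no JSON result found in module stdout")
    return json.loads(stdout[start:])

# Abbreviated version of the stdout captured in the trace above.
raw = '{"changed": true, "warnings": [], "stderr": "[002] #0, state:None\\n"}'
result = parse_module_stdout(raw)
assert result["changed"] is True and result["warnings"] == []
```

This mirrors why the log prints `done with _execute_module (...)` only after the chunked stdout has been reassembled into a single string.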
30582 1726855375.87645: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855375.87740: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b4 30582 1726855375.87753: variable 'ansible_search_path' from source: unknown 30582 1726855375.87757: variable 'ansible_search_path' from source: unknown 30582 1726855375.87786: calling self._execute() 30582 1726855375.87870: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.87873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.87880: variable 'omit' from source: magic vars 30582 1726855375.88167: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.88174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.88262: variable 'network_state' from source: role '' defaults 30582 1726855375.88271: Evaluated conditional (network_state != {}): False 30582 1726855375.88274: when evaluation is False, skipping this task 30582 1726855375.88277: _execute() done 30582 1726855375.88279: dumping result to json 30582 1726855375.88282: done dumping result, returning 30582 1726855375.88291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-0000000021b4] 30582 1726855375.88339: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b4 30582 1726855375.88405: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b4 30582 1726855375.88409: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30582 1726855375.88492: no more pending results, returning what we have 30582 1726855375.88496: results queue empty 30582 1726855375.88497: checking for any_errors_fatal 30582 1726855375.88506: done checking for any_errors_fatal
30582 1726855375.88506: checking for max_fail_percentage 30582 1726855375.88508: done checking for max_fail_percentage 30582 1726855375.88509: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.88509: done checking to see if all hosts have failed 30582 1726855375.88510: getting the remaining hosts for this loop 30582 1726855375.88511: done getting the remaining hosts for this loop 30582 1726855375.88514: getting the next task for host managed_node3 30582 1726855375.88523: done getting next task for host managed_node3 30582 1726855375.88526: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855375.88531: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855375.88550: getting variables 30582 1726855375.88551: in VariableManager get_vars() 30582 1726855375.88590: Calling all_inventory to load vars for managed_node3 30582 1726855375.88593: Calling groups_inventory to load vars for managed_node3 30582 1726855375.88595: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.88604: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.88606: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.88609: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.89425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.90307: done with get_vars() 30582 1726855375.90325: done getting variables 30582 1726855375.90370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:55 -0400 (0:00:00.032) 0:01:52.253 ****** 30582 1726855375.90402: entering _queue_task() for managed_node3/debug 30582 1726855375.90668: worker is 1 (out of 1 available) 30582 1726855375.90684: exiting _queue_task() for managed_node3/debug 30582 1726855375.90698: done queuing things up, now waiting for results queue to drain 30582 1726855375.90700: waiting for pending results... 
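The `skipping:` result above comes from the role's `network_state != {}` guard: `network_state` falls back to the role default of an empty dict, so the conditional evaluates to False and the task body never runs. A minimal stand-in for that check in plain Python (the function name is hypothetical; Ansible actually evaluates the expression through Jinja2):

```python
def should_run_network_state_task(network_state: dict) -> bool:
    # Mirrors the log line:
    #   Evaluated conditional (network_state != {}): False
    return network_state != {}

assert should_run_network_state_task({}) is False             # skipped, as in the log
assert should_run_network_state_task({"interfaces": []}) is True
```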
30582 1726855375.90894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855375.90997: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b5 30582 1726855375.91010: variable 'ansible_search_path' from source: unknown 30582 1726855375.91013: variable 'ansible_search_path' from source: unknown 30582 1726855375.91043: calling self._execute() 30582 1726855375.91121: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.91125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.91132: variable 'omit' from source: magic vars 30582 1726855375.91403: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.91413: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.91418: variable 'omit' from source: magic vars 30582 1726855375.91466: variable 'omit' from source: magic vars 30582 1726855375.91490: variable 'omit' from source: magic vars 30582 1726855375.91522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855375.91547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855375.91563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855375.91578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855375.91591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855375.91615: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855375.91618: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.91620: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855375.91692: Set connection var ansible_timeout to 10 30582 1726855375.91695: Set connection var ansible_connection to ssh 30582 1726855375.91706: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855375.91709: Set connection var ansible_pipelining to False 30582 1726855375.91711: Set connection var ansible_shell_executable to /bin/sh 30582 1726855375.91713: Set connection var ansible_shell_type to sh 30582 1726855375.91730: variable 'ansible_shell_executable' from source: unknown 30582 1726855375.91733: variable 'ansible_connection' from source: unknown 30582 1726855375.91735: variable 'ansible_module_compression' from source: unknown 30582 1726855375.91738: variable 'ansible_shell_type' from source: unknown 30582 1726855375.91740: variable 'ansible_shell_executable' from source: unknown 30582 1726855375.91742: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.91745: variable 'ansible_pipelining' from source: unknown 30582 1726855375.91747: variable 'ansible_timeout' from source: unknown 30582 1726855375.91752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.91854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855375.91866: variable 'omit' from source: magic vars 30582 1726855375.91869: starting attempt loop 30582 1726855375.91872: running the handler 30582 1726855375.91968: variable '__network_connections_result' from source: set_fact 30582 1726855375.92007: handler run complete 30582 1726855375.92020: attempt loop complete, returning result 30582 1726855375.92025: _execute() done 30582 1726855375.92028: dumping result to json 30582 1726855375.92030: 
done dumping result, returning 30582 1726855375.92041: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000021b5] 30582 1726855375.92043: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b5 30582 1726855375.92124: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b5 30582 1726855375.92126: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76"
    ]
}
30582 1726855375.92205: no more pending results, returning what we have 30582 1726855375.92209: results queue empty 30582 1726855375.92210: checking for any_errors_fatal 30582 1726855375.92216: done checking for any_errors_fatal 30582 1726855375.92217: checking for max_fail_percentage 30582 1726855375.92219: done checking for max_fail_percentage 30582 1726855375.92220: checking to see if all hosts have failed and the running result is not ok 30582 1726855375.92220: done checking to see if all hosts have failed 30582 1726855375.92221: getting the remaining hosts for this loop 30582 1726855375.92223: done getting the remaining hosts for this loop 30582 1726855375.92226: getting the next task for host managed_node3 30582 1726855375.92234: done getting next task for host managed_node3 30582 1726855375.92237: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855375.92242: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855375.92255: getting variables 30582 1726855375.92256: in VariableManager get_vars() 30582 1726855375.92298: Calling all_inventory to load vars for managed_node3 30582 1726855375.92300: Calling groups_inventory to load vars for managed_node3 30582 1726855375.92302: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855375.92312: Calling all_plugins_play to load vars for managed_node3 30582 1726855375.92315: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855375.92317: Calling groups_plugins_play to load vars for managed_node3 30582 1726855375.93250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855375.98615: done with get_vars() 30582 1726855375.98636: done getting variables 30582 1726855375.98675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:55 -0400 (0:00:00.083) 0:01:52.336 ****** 30582 1726855375.98706: entering _queue_task() for managed_node3/debug 30582 1726855375.98985: worker is 1 (out of 1 available) 30582 1726855375.99001: exiting _queue_task() for managed_node3/debug 30582 1726855375.99013: done queuing things up, now waiting for results queue to drain 30582 1726855375.99016: waiting for pending results... 30582 1726855375.99203: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855375.99308: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b6 30582 1726855375.99321: variable 'ansible_search_path' from source: unknown 30582 1726855375.99326: variable 'ansible_search_path' from source: unknown 30582 1726855375.99358: calling self._execute() 30582 1726855375.99435: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.99440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855375.99448: variable 'omit' from source: magic vars 30582 1726855375.99733: variable 'ansible_distribution_major_version' from source: facts 30582 1726855375.99742: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855375.99748: variable 'omit' from source: magic vars 30582 1726855375.99797: variable 'omit' from source: magic vars 30582 1726855375.99818: variable 'omit' from source: magic vars 30582 1726855375.99852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855375.99878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855375.99897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855375.99914: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855375.99924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855375.99946: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855375.99950: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855375.99952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.00025: Set connection var ansible_timeout to 10 30582 1726855376.00029: Set connection var ansible_connection to ssh 30582 1726855376.00034: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855376.00040: Set connection var ansible_pipelining to False 30582 1726855376.00045: Set connection var ansible_shell_executable to /bin/sh 30582 1726855376.00047: Set connection var ansible_shell_type to sh 30582 1726855376.00066: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.00069: variable 'ansible_connection' from source: unknown 30582 1726855376.00072: variable 'ansible_module_compression' from source: unknown 30582 1726855376.00074: variable 'ansible_shell_type' from source: unknown 30582 1726855376.00076: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.00079: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.00081: variable 'ansible_pipelining' from source: unknown 30582 1726855376.00083: variable 'ansible_timeout' from source: unknown 30582 1726855376.00085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.00183: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855376.00194: variable 'omit' from source: magic vars 30582 1726855376.00200: starting attempt loop 30582 1726855376.00203: running the handler 30582 1726855376.00243: variable '__network_connections_result' from source: set_fact 30582 1726855376.00305: variable '__network_connections_result' from source: set_fact 30582 1726855376.00392: handler run complete 30582 1726855376.00409: attempt loop complete, returning result 30582 1726855376.00413: _execute() done 30582 1726855376.00416: dumping result to json 30582 1726855376.00419: done dumping result, returning 30582 1726855376.00428: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000021b6] 30582 1726855376.00432: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b6 30582 1726855376.00527: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b6 30582 1726855376.00530: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76"
        ]
    }
}
30582 1726855376.00635: no more pending results, returning what we have 30582 1726855376.00639: results queue
empty 30582 1726855376.00640: checking for any_errors_fatal 30582 1726855376.00645: done checking for any_errors_fatal 30582 1726855376.00645: checking for max_fail_percentage 30582 1726855376.00647: done checking for max_fail_percentage 30582 1726855376.00648: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.00649: done checking to see if all hosts have failed 30582 1726855376.00649: getting the remaining hosts for this loop 30582 1726855376.00652: done getting the remaining hosts for this loop 30582 1726855376.00655: getting the next task for host managed_node3 30582 1726855376.00662: done getting next task for host managed_node3 30582 1726855376.00668: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855376.00673: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855376.00685: getting variables 30582 1726855376.00689: in VariableManager get_vars() 30582 1726855376.00730: Calling all_inventory to load vars for managed_node3 30582 1726855376.00732: Calling groups_inventory to load vars for managed_node3 30582 1726855376.00734: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.00742: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.00744: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.00746: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.01554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.02446: done with get_vars() 30582 1726855376.02463: done getting variables 30582 1726855376.02508: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:56 -0400 (0:00:00.038) 0:01:52.375 ****** 30582 1726855376.02533: entering _queue_task() for managed_node3/debug 30582 1726855376.02784: worker is 1 (out of 1 available) 30582 1726855376.02799: exiting _queue_task() for managed_node3/debug 30582 1726855376.02811: done queuing things up, now waiting for results queue to drain 30582 1726855376.02813: waiting for pending results... 
30582 1726855376.02996: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855376.03101: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b7 30582 1726855376.03113: variable 'ansible_search_path' from source: unknown 30582 1726855376.03116: variable 'ansible_search_path' from source: unknown 30582 1726855376.03151: calling self._execute() 30582 1726855376.03230: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.03234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.03243: variable 'omit' from source: magic vars 30582 1726855376.03536: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.03545: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.03635: variable 'network_state' from source: role '' defaults 30582 1726855376.03643: Evaluated conditional (network_state != {}): False 30582 1726855376.03646: when evaluation is False, skipping this task 30582 1726855376.03649: _execute() done 30582 1726855376.03651: dumping result to json 30582 1726855376.03654: done dumping result, returning 30582 1726855376.03662: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-0000000021b7] 30582 1726855376.03668: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b7 30582 1726855376.03757: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b7 30582 1726855376.03760: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855376.03842: no more pending results, returning what we have 30582 1726855376.03845: results queue empty 30582 1726855376.03846: checking for any_errors_fatal 30582 1726855376.03859: done checking for any_errors_fatal 30582 1726855376.03860: checking for 
max_fail_percentage 30582 1726855376.03861: done checking for max_fail_percentage 30582 1726855376.03862: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.03865: done checking to see if all hosts have failed 30582 1726855376.03866: getting the remaining hosts for this loop 30582 1726855376.03867: done getting the remaining hosts for this loop 30582 1726855376.03871: getting the next task for host managed_node3 30582 1726855376.03880: done getting next task for host managed_node3 30582 1726855376.03884: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855376.03891: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855376.03910: getting variables 30582 1726855376.03911: in VariableManager get_vars() 30582 1726855376.03950: Calling all_inventory to load vars for managed_node3 30582 1726855376.03952: Calling groups_inventory to load vars for managed_node3 30582 1726855376.03954: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.03962: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.03967: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.03970: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.04940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.05814: done with get_vars() 30582 1726855376.05834: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:56 -0400 (0:00:00.033) 0:01:52.409 ****** 30582 1726855376.05909: entering _queue_task() for managed_node3/ping 30582 1726855376.06180: worker is 1 (out of 1 available) 30582 1726855376.06196: exiting _queue_task() for managed_node3/ping 30582 1726855376.06208: done queuing things up, now waiting for results queue to drain 30582 1726855376.06210: waiting for pending results... 
30582 1726855376.06401: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855376.06505: in run() - task 0affcc66-ac2b-aa83-7d57-0000000021b8 30582 1726855376.06518: variable 'ansible_search_path' from source: unknown 30582 1726855376.06521: variable 'ansible_search_path' from source: unknown 30582 1726855376.06553: calling self._execute() 30582 1726855376.06632: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.06635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.06645: variable 'omit' from source: magic vars 30582 1726855376.06940: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.06949: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.06955: variable 'omit' from source: magic vars 30582 1726855376.07007: variable 'omit' from source: magic vars 30582 1726855376.07030: variable 'omit' from source: magic vars 30582 1726855376.07068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855376.07099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855376.07114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855376.07128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.07138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.07161: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855376.07168: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.07171: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855376.07245: Set connection var ansible_timeout to 10 30582 1726855376.07249: Set connection var ansible_connection to ssh 30582 1726855376.07254: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855376.07259: Set connection var ansible_pipelining to False 30582 1726855376.07267: Set connection var ansible_shell_executable to /bin/sh 30582 1726855376.07271: Set connection var ansible_shell_type to sh 30582 1726855376.07286: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.07292: variable 'ansible_connection' from source: unknown 30582 1726855376.07295: variable 'ansible_module_compression' from source: unknown 30582 1726855376.07298: variable 'ansible_shell_type' from source: unknown 30582 1726855376.07300: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.07302: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.07307: variable 'ansible_pipelining' from source: unknown 30582 1726855376.07309: variable 'ansible_timeout' from source: unknown 30582 1726855376.07311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.07465: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855376.07474: variable 'omit' from source: magic vars 30582 1726855376.07480: starting attempt loop 30582 1726855376.07482: running the handler 30582 1726855376.07497: _low_level_execute_command(): starting 30582 1726855376.07504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855376.08024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855376.08029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.08033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.08079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.08082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855376.08099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.08170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.09877: stdout chunk (state=3): >>>/root <<< 30582 1726855376.09969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.09999: stderr chunk (state=3): >>><<< 30582 1726855376.10002: stdout chunk (state=3): >>><<< 30582 1726855376.10025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.10041: _low_level_execute_command(): starting 30582 1726855376.10047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746 `" && echo ansible-tmp-1726855376.1002703-35792-51884029817746="` echo /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746 `" ) && sleep 0' 30582 1726855376.10514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.10517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855376.10520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855376.10530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855376.10533: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.10589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.10592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855376.10595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.10643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.12543: stdout chunk (state=3): >>>ansible-tmp-1726855376.1002703-35792-51884029817746=/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746 <<< 30582 1726855376.12645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.12676: stderr chunk (state=3): >>><<< 30582 1726855376.12679: stdout chunk (state=3): >>><<< 30582 1726855376.12698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855376.1002703-35792-51884029817746=/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.12744: variable 'ansible_module_compression' from source: unknown 30582 1726855376.12780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855376.12809: variable 'ansible_facts' from source: unknown 30582 1726855376.12867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py 30582 1726855376.12969: Sending initial data 30582 1726855376.12972: Sent initial data (152 bytes) 30582 1726855376.13436: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.13440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.13442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.13444: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855376.13446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.13500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.13506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.13568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.15114: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855376.15118: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855376.15171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855376.15229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp__6fkux2 /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py <<< 30582 1726855376.15232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py" <<< 30582 1726855376.15290: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp__6fkux2" to remote "/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py" <<< 30582 1726855376.15293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py" <<< 30582 1726855376.15864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.15905: stderr chunk (state=3): >>><<< 30582 1726855376.15908: stdout chunk (state=3): >>><<< 30582 1726855376.15926: done transferring module to remote 30582 1726855376.15934: _low_level_execute_command(): starting 30582 1726855376.15938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/ /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py && sleep 0' 30582 1726855376.16376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.16379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.16381: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.16385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.16435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.16439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.16507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.18251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.18282: stderr chunk (state=3): >>><<< 30582 1726855376.18285: stdout chunk (state=3): >>><<< 30582 1726855376.18301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.18304: _low_level_execute_command(): starting 30582 1726855376.18309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/AnsiballZ_ping.py && sleep 0' 30582 1726855376.18764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.18768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855376.18770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.18772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.18774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.18823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
<<< 30582 1726855376.18826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.18896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.33778: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855376.35049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855376.35079: stderr chunk (state=3): >>><<< 30582 1726855376.35082: stdout chunk (state=3): >>><<< 30582 1726855376.35101: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855376.35125: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855376.35134: _low_level_execute_command(): starting 30582 1726855376.35138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855376.1002703-35792-51884029817746/ > /dev/null 2>&1 && sleep 0' 30582 1726855376.35577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.35581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855376.35595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.35612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.35615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.35667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.35670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855376.35680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.35747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.37570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.37598: stderr chunk (state=3): >>><<< 30582 1726855376.37601: stdout chunk (state=3): >>><<< 30582 1726855376.37617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0
30582 1726855376.37620: handler run complete
30582 1726855376.37634: attempt loop complete, returning result
30582 1726855376.37637: _execute() done
30582 1726855376.37639: dumping result to json
30582 1726855376.37641: done dumping result, returning
30582 1726855376.37650: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-0000000021b8]
30582 1726855376.37656: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b8
30582 1726855376.37751: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000021b8
30582 1726855376.37753: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "ping": "pong"
}
30582 1726855376.37829: no more pending results, returning what we have
30582 1726855376.37832: results queue empty
30582 1726855376.37833: checking for any_errors_fatal
30582 1726855376.37840: done checking for any_errors_fatal
30582 1726855376.37841: checking for max_fail_percentage
30582 1726855376.37843: done checking for max_fail_percentage
30582 1726855376.37844: checking to see if all hosts have failed and the running result is not ok
30582 1726855376.37844: done checking to see if all hosts have failed
30582 1726855376.37845: getting the remaining hosts for this loop
30582 1726855376.37846: done getting the remaining hosts for this loop
30582 1726855376.37850: getting the next task for host managed_node3
30582 1726855376.37865: done getting next task for host managed_node3
30582 1726855376.37867: ^ task is: TASK: meta (role_complete)
30582 1726855376.37873: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855376.37885: getting variables 30582 1726855376.37889: in VariableManager get_vars() 30582 1726855376.37935: Calling all_inventory to load vars for managed_node3 30582 1726855376.37938: Calling groups_inventory to load vars for managed_node3 30582 1726855376.37940: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.37949: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.37952: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.37954: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.38786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.39685: done with get_vars() 30582 1726855376.39704: done getting variables 30582 1726855376.39768: done queuing things up, now waiting for results queue to drain 30582 1726855376.39769: results queue empty 30582 1726855376.39770: checking for any_errors_fatal 30582 1726855376.39772: done checking for 
any_errors_fatal 30582 1726855376.39772: checking for max_fail_percentage 30582 1726855376.39773: done checking for max_fail_percentage 30582 1726855376.39774: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.39774: done checking to see if all hosts have failed 30582 1726855376.39775: getting the remaining hosts for this loop 30582 1726855376.39775: done getting the remaining hosts for this loop 30582 1726855376.39777: getting the next task for host managed_node3 30582 1726855376.39781: done getting next task for host managed_node3 30582 1726855376.39782: ^ task is: TASK: Show result 30582 1726855376.39784: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30582 1726855376.39786: getting variables
30582 1726855376.39788: in VariableManager get_vars()
30582 1726855376.39798: Calling all_inventory to load vars for managed_node3
30582 1726855376.39799: Calling groups_inventory to load vars for managed_node3
30582 1726855376.39801: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855376.39804: Calling all_plugins_play to load vars for managed_node3
30582 1726855376.39806: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855376.39807: Calling groups_plugins_play to load vars for managed_node3
30582 1726855376.40552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855376.41434: done with get_vars()
30582 1726855376.41449: done getting variables
30582 1726855376.41483: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show result] *************************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14
Friday 20 September 2024 14:02:56 -0400 (0:00:00.355) 0:01:52.765 ******
30582 1726855376.41508: entering _queue_task() for managed_node3/debug
30582 1726855376.41838: worker is 1 (out of 1 available)
30582 1726855376.41854: exiting _queue_task() for managed_node3/debug
30582 1726855376.41868: done queuing things up, now waiting for results queue to drain
30582 1726855376.41870: waiting for pending results...
30582 1726855376.42055: running TaskExecutor() for managed_node3/TASK: Show result 30582 1726855376.42156: in run() - task 0affcc66-ac2b-aa83-7d57-00000000213a 30582 1726855376.42168: variable 'ansible_search_path' from source: unknown 30582 1726855376.42176: variable 'ansible_search_path' from source: unknown 30582 1726855376.42209: calling self._execute() 30582 1726855376.42283: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.42289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.42300: variable 'omit' from source: magic vars 30582 1726855376.42589: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.42599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.42605: variable 'omit' from source: magic vars 30582 1726855376.42637: variable 'omit' from source: magic vars 30582 1726855376.42667: variable 'omit' from source: magic vars 30582 1726855376.42700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855376.42726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855376.42743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855376.42757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.42769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.42792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855376.42796: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.42798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.42875: Set 
connection var ansible_timeout to 10 30582 1726855376.42878: Set connection var ansible_connection to ssh 30582 1726855376.42881: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855376.42884: Set connection var ansible_pipelining to False 30582 1726855376.42886: Set connection var ansible_shell_executable to /bin/sh 30582 1726855376.42891: Set connection var ansible_shell_type to sh 30582 1726855376.42909: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.42911: variable 'ansible_connection' from source: unknown 30582 1726855376.42914: variable 'ansible_module_compression' from source: unknown 30582 1726855376.42916: variable 'ansible_shell_type' from source: unknown 30582 1726855376.42919: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.42921: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.42923: variable 'ansible_pipelining' from source: unknown 30582 1726855376.42927: variable 'ansible_timeout' from source: unknown 30582 1726855376.42931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.43034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855376.43042: variable 'omit' from source: magic vars 30582 1726855376.43047: starting attempt loop 30582 1726855376.43050: running the handler 30582 1726855376.43095: variable '__network_connections_result' from source: set_fact 30582 1726855376.43150: variable '__network_connections_result' from source: set_fact 30582 1726855376.43238: handler run complete 30582 1726855376.43256: attempt loop complete, returning result 30582 1726855376.43259: _execute() done 30582 1726855376.43261: dumping result to json 30582 
1726855376.43268: done dumping result, returning
30582 1726855376.43274: done running TaskExecutor() for managed_node3/TASK: Show result [0affcc66-ac2b-aa83-7d57-00000000213a]
30582 1726855376.43278: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000213a
30582 1726855376.43374: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000213a
30582 1726855376.43377: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76"
        ]
    }
}
30582 1726855376.43457: no more pending results, returning what we have
30582 1726855376.43460: results queue empty
30582 1726855376.43461: checking for any_errors_fatal
30582 1726855376.43463: done checking for any_errors_fatal
30582 1726855376.43465: checking for max_fail_percentage
30582 1726855376.43468: done checking for max_fail_percentage
30582 1726855376.43469: checking to see if all hosts have failed and the running result is not ok
30582 1726855376.43469: done checking to see if all hosts have failed
30582 1726855376.43470: getting the remaining hosts for this loop
30582 1726855376.43472: done getting the remaining hosts for this loop
30582 1726855376.43475: getting the next task for host managed_node3
30582 1726855376.43490: done getting next task for host managed_node3
30582 1726855376.43494: ^ task is: TASK: Include network role
30582 1726855376.43498: ^ state is:
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855376.43502: getting variables
30582 1726855376.43504: in VariableManager get_vars()
30582 1726855376.43543: Calling all_inventory to load vars for managed_node3
30582 1726855376.43546: Calling groups_inventory to load vars for managed_node3
30582 1726855376.43549: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855376.43559: Calling all_plugins_play to load vars for managed_node3
30582 1726855376.43561: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855376.43566: Calling groups_plugins_play to load vars for managed_node3
30582 1726855376.44398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855376.45408: done with get_vars()
30582 1726855376.45425: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3
Friday 20 September 2024 14:02:56 -0400 (0:00:00.039) 0:01:52.804 ******
30582 1726855376.45502: entering _queue_task() for
managed_node3/include_role 30582 1726855376.45767: worker is 1 (out of 1 available) 30582 1726855376.45782: exiting _queue_task() for managed_node3/include_role 30582 1726855376.45796: done queuing things up, now waiting for results queue to drain 30582 1726855376.45798: waiting for pending results... 30582 1726855376.45982: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855376.46091: in run() - task 0affcc66-ac2b-aa83-7d57-00000000213e 30582 1726855376.46105: variable 'ansible_search_path' from source: unknown 30582 1726855376.46108: variable 'ansible_search_path' from source: unknown 30582 1726855376.46139: calling self._execute() 30582 1726855376.46212: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.46217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.46225: variable 'omit' from source: magic vars 30582 1726855376.46512: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.46521: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.46527: _execute() done 30582 1726855376.46530: dumping result to json 30582 1726855376.46533: done dumping result, returning 30582 1726855376.46540: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-00000000213e] 30582 1726855376.46545: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000213e 30582 1726855376.46654: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000213e 30582 1726855376.46657: WORKER PROCESS EXITING 30582 1726855376.46700: no more pending results, returning what we have 30582 1726855376.46705: in VariableManager get_vars() 30582 1726855376.46752: Calling all_inventory to load vars for managed_node3 30582 1726855376.46755: Calling groups_inventory to load vars for managed_node3 30582 1726855376.46758: Calling all_plugins_inventory to load vars for managed_node3 30582 
1726855376.46773: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.46776: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.46778: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.47598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.48488: done with get_vars() 30582 1726855376.48506: variable 'ansible_search_path' from source: unknown 30582 1726855376.48507: variable 'ansible_search_path' from source: unknown 30582 1726855376.48604: variable 'omit' from source: magic vars 30582 1726855376.48633: variable 'omit' from source: magic vars 30582 1726855376.48643: variable 'omit' from source: magic vars 30582 1726855376.48646: we have included files to process 30582 1726855376.48647: generating all_blocks data 30582 1726855376.48649: done generating all_blocks data 30582 1726855376.48654: processing included file: fedora.linux_system_roles.network 30582 1726855376.48669: in VariableManager get_vars() 30582 1726855376.48681: done with get_vars() 30582 1726855376.48703: in VariableManager get_vars() 30582 1726855376.48715: done with get_vars() 30582 1726855376.48744: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855376.48818: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855376.48872: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855376.49147: in VariableManager get_vars() 30582 1726855376.49161: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855376.50402: iterating over new_blocks loaded from include file 30582 1726855376.50403: in VariableManager get_vars() 30582 1726855376.50415: done with get_vars() 30582 1726855376.50417: 
filtering new block on tags 30582 1726855376.50574: done filtering new block on tags 30582 1726855376.50577: in VariableManager get_vars() 30582 1726855376.50589: done with get_vars() 30582 1726855376.50590: filtering new block on tags 30582 1726855376.50603: done filtering new block on tags 30582 1726855376.50604: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855376.50608: extending task lists for all hosts with included blocks 30582 1726855376.50672: done extending task lists 30582 1726855376.50673: done processing included files 30582 1726855376.50674: results queue empty 30582 1726855376.50674: checking for any_errors_fatal 30582 1726855376.50678: done checking for any_errors_fatal 30582 1726855376.50679: checking for max_fail_percentage 30582 1726855376.50679: done checking for max_fail_percentage 30582 1726855376.50680: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.50680: done checking to see if all hosts have failed 30582 1726855376.50681: getting the remaining hosts for this loop 30582 1726855376.50682: done getting the remaining hosts for this loop 30582 1726855376.50683: getting the next task for host managed_node3 30582 1726855376.50686: done getting next task for host managed_node3 30582 1726855376.50690: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855376.50692: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855376.50699: getting variables
30582 1726855376.50700: in VariableManager get_vars()
30582 1726855376.50711: Calling all_inventory to load vars for managed_node3
30582 1726855376.50713: Calling groups_inventory to load vars for managed_node3
30582 1726855376.50714: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855376.50718: Calling all_plugins_play to load vars for managed_node3
30582 1726855376.50719: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855376.50721: Calling groups_plugins_play to load vars for managed_node3
30582 1726855376.51431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855376.52304: done with get_vars()
30582 1726855376.52319: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 14:02:56 -0400 (0:00:00.068) 0:01:52.873 ******
30582 1726855376.52375: entering _queue_task() for managed_node3/include_tasks
30582 1726855376.52654: worker is 1 (out of 1 available)
30582
1726855376.52670: exiting _queue_task() for managed_node3/include_tasks 30582 1726855376.52682: done queuing things up, now waiting for results queue to drain 30582 1726855376.52684: waiting for pending results... 30582 1726855376.52869: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855376.52956: in run() - task 0affcc66-ac2b-aa83-7d57-000000002328 30582 1726855376.52968: variable 'ansible_search_path' from source: unknown 30582 1726855376.52975: variable 'ansible_search_path' from source: unknown 30582 1726855376.53007: calling self._execute() 30582 1726855376.53083: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.53090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.53098: variable 'omit' from source: magic vars 30582 1726855376.53373: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.53383: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.53391: _execute() done 30582 1726855376.53394: dumping result to json 30582 1726855376.53397: done dumping result, returning 30582 1726855376.53405: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000002328] 30582 1726855376.53409: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002328 30582 1726855376.53498: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002328 30582 1726855376.53501: WORKER PROCESS EXITING 30582 1726855376.53547: no more pending results, returning what we have 30582 1726855376.53552: in VariableManager get_vars() 30582 1726855376.53605: Calling all_inventory to load vars for managed_node3 30582 1726855376.53611: Calling groups_inventory to load vars for managed_node3 30582 1726855376.53613: Calling all_plugins_inventory to load vars for managed_node3 
30582 1726855376.53625: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.53628: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.53631: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.54432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.55439: done with get_vars() 30582 1726855376.55454: variable 'ansible_search_path' from source: unknown 30582 1726855376.55455: variable 'ansible_search_path' from source: unknown 30582 1726855376.55483: we have included files to process 30582 1726855376.55484: generating all_blocks data 30582 1726855376.55485: done generating all_blocks data 30582 1726855376.55490: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855376.55491: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855376.55492: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855376.55873: done processing included file 30582 1726855376.55875: iterating over new_blocks loaded from include file 30582 1726855376.55876: in VariableManager get_vars() 30582 1726855376.55895: done with get_vars() 30582 1726855376.55896: filtering new block on tags 30582 1726855376.55915: done filtering new block on tags 30582 1726855376.55917: in VariableManager get_vars() 30582 1726855376.55931: done with get_vars() 30582 1726855376.55932: filtering new block on tags 30582 1726855376.55956: done filtering new block on tags 30582 1726855376.55958: in VariableManager get_vars() 30582 1726855376.55976: done with get_vars() 30582 1726855376.55977: filtering new block on tags 30582 1726855376.56004: done filtering new block on tags 30582 1726855376.56006: done iterating over new_blocks 
loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855376.56009: extending task lists for all hosts with included blocks 30582 1726855376.56986: done extending task lists 30582 1726855376.56989: done processing included files 30582 1726855376.56989: results queue empty 30582 1726855376.56990: checking for any_errors_fatal 30582 1726855376.56992: done checking for any_errors_fatal 30582 1726855376.56992: checking for max_fail_percentage 30582 1726855376.56993: done checking for max_fail_percentage 30582 1726855376.56994: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.56994: done checking to see if all hosts have failed 30582 1726855376.56995: getting the remaining hosts for this loop 30582 1726855376.56996: done getting the remaining hosts for this loop 30582 1726855376.56997: getting the next task for host managed_node3 30582 1726855376.57001: done getting next task for host managed_node3 30582 1726855376.57002: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855376.57005: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855376.57013: getting variables
30582 1726855376.57014: in VariableManager get_vars()
30582 1726855376.57023: Calling all_inventory to load vars for managed_node3
30582 1726855376.57024: Calling groups_inventory to load vars for managed_node3
30582 1726855376.57025: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855376.57029: Calling all_plugins_play to load vars for managed_node3
30582 1726855376.57031: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855376.57032: Calling groups_plugins_play to load vars for managed_node3
30582 1726855376.57666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855376.58536: done with get_vars()
30582 1726855376.58551: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 14:02:56 -0400 (0:00:00.062) 0:01:52.936 ******
30582 1726855376.58608: entering _queue_task() for managed_node3/setup
30582 1726855376.58882: worker is 1 (out of 1 available)
30582 1726855376.58898: exiting _queue_task() for managed_node3/setup
30582
1726855376.58910: done queuing things up, now waiting for results queue to drain 30582 1726855376.58912: waiting for pending results... 30582 1726855376.59107: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855376.59199: in run() - task 0affcc66-ac2b-aa83-7d57-00000000237f 30582 1726855376.59212: variable 'ansible_search_path' from source: unknown 30582 1726855376.59217: variable 'ansible_search_path' from source: unknown 30582 1726855376.59245: calling self._execute() 30582 1726855376.59320: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.59325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.59332: variable 'omit' from source: magic vars 30582 1726855376.59625: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.59635: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.59781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855376.61427: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855376.61473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855376.61503: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855376.61533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855376.61551: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855376.61611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30582 1726855376.61631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855376.61652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855376.61681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855376.61694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855376.61731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855376.61747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855376.61773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855376.61799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855376.61810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855376.61927: variable '__network_required_facts' from source: role '' defaults 30582 1726855376.61934: variable 'ansible_facts' from source: unknown 30582 1726855376.62369: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855376.62373: when evaluation is False, skipping this task 30582 1726855376.62375: _execute() done 30582 1726855376.62378: dumping result to json 30582 1726855376.62380: done dumping result, returning 30582 1726855376.62383: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-00000000237f] 30582 1726855376.62390: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000237f 30582 1726855376.62476: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000237f 30582 1726855376.62478: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855376.62549: no more pending results, returning what we have 30582 1726855376.62553: results queue empty 30582 1726855376.62554: checking for any_errors_fatal 30582 1726855376.62556: done checking for any_errors_fatal 30582 1726855376.62557: checking for max_fail_percentage 30582 1726855376.62559: done checking for max_fail_percentage 30582 1726855376.62559: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.62560: done checking to see if all hosts have failed 30582 1726855376.62561: getting the remaining hosts for this loop 30582 1726855376.62562: done getting the remaining hosts for this loop 30582 1726855376.62569: getting the next task for host managed_node3 30582 1726855376.62580: done getting next task for host managed_node3 
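The skip above is driven by the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: Jinja's `difference` filter is set subtraction, so the fact-gathering task runs only when at least one fact the role needs is still absent. A minimal Python sketch of that check (the fact names used in the example are illustrative, not taken from this log):

```python
# Sketch of the role's gating condition:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The setup task runs only when some required fact is missing from ansible_facts.

def needs_fact_gathering(required_facts, ansible_facts):
    """Return True when at least one required fact is absent (task should run)."""
    missing = set(required_facts) - set(ansible_facts)  # Jinja `difference`
    return len(missing) > 0

# Hypothetical fact names, for illustration only:
facts = {"ansible_distribution": "CentOS", "ansible_os_family": "RedHat"}
needs_fact_gathering(["ansible_distribution"], facts)                       # False -> skip
needs_fact_gathering(["ansible_distribution", "ansible_selinux"], facts)    # True  -> gather
```

In this run every required fact was already cached, so the condition evaluated False and the task was skipped (its result body is hidden because the task sets `no_log: true`).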
30582 1726855376.62583: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855376.62592: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855376.62617: getting variables 30582 1726855376.62619: in VariableManager get_vars() 30582 1726855376.62662: Calling all_inventory to load vars for managed_node3 30582 1726855376.62667: Calling groups_inventory to load vars for managed_node3 30582 1726855376.62669: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.62679: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.62682: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.62699: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.63626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.64528: done with get_vars() 30582 1726855376.64551: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:02:56 -0400 (0:00:00.060) 0:01:52.996 ****** 30582 1726855376.64628: entering _queue_task() for managed_node3/stat 30582 1726855376.64914: worker is 1 (out of 1 available) 30582 1726855376.64928: exiting _queue_task() for managed_node3/stat 30582 1726855376.64941: done queuing things up, now waiting for results queue to drain 30582 1726855376.64942: waiting for pending results... 
30582 1726855376.65128: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855376.65231: in run() - task 0affcc66-ac2b-aa83-7d57-000000002381 30582 1726855376.65244: variable 'ansible_search_path' from source: unknown 30582 1726855376.65247: variable 'ansible_search_path' from source: unknown 30582 1726855376.65277: calling self._execute() 30582 1726855376.65347: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.65351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.65358: variable 'omit' from source: magic vars 30582 1726855376.65636: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.65645: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.65760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855376.65959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855376.65994: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855376.66019: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855376.66050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855376.66112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855376.66129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855376.66147: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855376.66272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855376.66276: variable '__network_is_ostree' from source: set_fact 30582 1726855376.66278: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855376.66280: when evaluation is False, skipping this task 30582 1726855376.66282: _execute() done 30582 1726855376.66284: dumping result to json 30582 1726855376.66286: done dumping result, returning 30582 1726855376.66294: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-000000002381] 30582 1726855376.66296: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002381 30582 1726855376.66356: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002381 30582 1726855376.66358: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855376.66418: no more pending results, returning what we have 30582 1726855376.66422: results queue empty 30582 1726855376.66423: checking for any_errors_fatal 30582 1726855376.66433: done checking for any_errors_fatal 30582 1726855376.66433: checking for max_fail_percentage 30582 1726855376.66435: done checking for max_fail_percentage 30582 1726855376.66436: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.66437: done checking to see if all hosts have failed 30582 1726855376.66438: getting the remaining hosts for this loop 30582 1726855376.66440: done getting the remaining hosts for this loop 30582 
1726855376.66443: getting the next task for host managed_node3 30582 1726855376.66451: done getting next task for host managed_node3 30582 1726855376.66455: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855376.66460: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855376.66485: getting variables 30582 1726855376.66489: in VariableManager get_vars() 30582 1726855376.66529: Calling all_inventory to load vars for managed_node3 30582 1726855376.66532: Calling groups_inventory to load vars for managed_node3 30582 1726855376.66534: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.66544: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.66546: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.66549: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.67379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.68276: done with get_vars() 30582 1726855376.68297: done getting variables 30582 1726855376.68343: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:02:56 -0400 (0:00:00.037) 0:01:53.033 ****** 30582 1726855376.68373: entering _queue_task() for managed_node3/set_fact 30582 1726855376.68644: worker is 1 (out of 1 available) 30582 1726855376.68658: exiting _queue_task() for managed_node3/set_fact 30582 1726855376.68672: done queuing things up, now waiting for results queue to drain 30582 1726855376.68674: waiting for pending results... 
30582 1726855376.68866: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855376.68958: in run() - task 0affcc66-ac2b-aa83-7d57-000000002382 30582 1726855376.68971: variable 'ansible_search_path' from source: unknown 30582 1726855376.68974: variable 'ansible_search_path' from source: unknown 30582 1726855376.69006: calling self._execute() 30582 1726855376.69077: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.69081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.69091: variable 'omit' from source: magic vars 30582 1726855376.69372: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.69382: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.69498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855376.69695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855376.69728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855376.69753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855376.69784: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855376.69845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855376.69866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855376.69883: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855376.69906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855376.69973: variable '__network_is_ostree' from source: set_fact 30582 1726855376.69978: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855376.69981: when evaluation is False, skipping this task 30582 1726855376.69983: _execute() done 30582 1726855376.69988: dumping result to json 30582 1726855376.69991: done dumping result, returning 30582 1726855376.70006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-000000002382] 30582 1726855376.70008: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002382 30582 1726855376.70092: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002382 30582 1726855376.70095: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855376.70161: no more pending results, returning what we have 30582 1726855376.70167: results queue empty 30582 1726855376.70168: checking for any_errors_fatal 30582 1726855376.70175: done checking for any_errors_fatal 30582 1726855376.70175: checking for max_fail_percentage 30582 1726855376.70178: done checking for max_fail_percentage 30582 1726855376.70178: checking to see if all hosts have failed and the running result is not ok 30582 1726855376.70179: done checking to see if all hosts have failed 30582 1726855376.70180: getting the remaining hosts for this loop 30582 1726855376.70182: done getting the remaining hosts for this loop 
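Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are guarded by `not __network_is_ostree is defined`, so once the fact has been recorded earlier in the run, the stat probe and the set_fact are skipped: the guard turns the check into a run-once, cached operation. A sketch of the pattern from set_facts.yml, as a config fragment; the probed path and register name are assumptions, not taken from this log:

```yaml
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # conventional ostree marker; an assumption here
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Once `set_fact` has run, `__network_is_ostree` is defined for the rest of the play, which is exactly why both conditionals above evaluate False here.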
30582 1726855376.70185: getting the next task for host managed_node3 30582 1726855376.70198: done getting next task for host managed_node3 30582 1726855376.70201: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855376.70207: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855376.70231: getting variables 30582 1726855376.70232: in VariableManager get_vars() 30582 1726855376.70274: Calling all_inventory to load vars for managed_node3 30582 1726855376.70277: Calling groups_inventory to load vars for managed_node3 30582 1726855376.70280: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855376.70295: Calling all_plugins_play to load vars for managed_node3 30582 1726855376.70298: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855376.70301: Calling groups_plugins_play to load vars for managed_node3 30582 1726855376.71222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855376.72104: done with get_vars() 30582 1726855376.72121: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:02:56 -0400 (0:00:00.038) 0:01:53.071 ****** 30582 1726855376.72196: entering _queue_task() for managed_node3/service_facts 30582 1726855376.72458: worker is 1 (out of 1 available) 30582 1726855376.72474: exiting _queue_task() for managed_node3/service_facts 30582 1726855376.72490: done queuing things up, now waiting for results queue to drain 30582 1726855376.72492: waiting for pending results... 
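The "Check which services are running" task that follows actually connects to the host, and the log shows the per-host connection settings it resolves: `ansible_timeout` 10, `ansible_connection` ssh, pipelining off, `/bin/sh` as the shell, target address 10.31.9.244. These typically come from inventory or host vars; a minimal inventory sketch carrying the same values (the host alias and YAML layout are illustrative; `ansible_ssh_extra_args` is also set in this run but its value is not shown in the log, so it is omitted):

```yaml
all:
  hosts:
    managed_node3:
      ansible_host: 10.31.9.244
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
```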
30582 1726855376.72673: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855376.72774: in run() - task 0affcc66-ac2b-aa83-7d57-000000002384 30582 1726855376.72786: variable 'ansible_search_path' from source: unknown 30582 1726855376.72791: variable 'ansible_search_path' from source: unknown 30582 1726855376.72818: calling self._execute() 30582 1726855376.72891: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.72895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.72902: variable 'omit' from source: magic vars 30582 1726855376.73184: variable 'ansible_distribution_major_version' from source: facts 30582 1726855376.73195: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855376.73201: variable 'omit' from source: magic vars 30582 1726855376.73253: variable 'omit' from source: magic vars 30582 1726855376.73279: variable 'omit' from source: magic vars 30582 1726855376.73311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855376.73338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855376.73354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855376.73370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.73383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855376.73407: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855376.73410: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.73413: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855376.73484: Set connection var ansible_timeout to 10 30582 1726855376.73489: Set connection var ansible_connection to ssh 30582 1726855376.73492: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855376.73501: Set connection var ansible_pipelining to False 30582 1726855376.73504: Set connection var ansible_shell_executable to /bin/sh 30582 1726855376.73507: Set connection var ansible_shell_type to sh 30582 1726855376.73522: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.73524: variable 'ansible_connection' from source: unknown 30582 1726855376.73527: variable 'ansible_module_compression' from source: unknown 30582 1726855376.73529: variable 'ansible_shell_type' from source: unknown 30582 1726855376.73531: variable 'ansible_shell_executable' from source: unknown 30582 1726855376.73534: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855376.73538: variable 'ansible_pipelining' from source: unknown 30582 1726855376.73540: variable 'ansible_timeout' from source: unknown 30582 1726855376.73544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855376.73685: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855376.73696: variable 'omit' from source: magic vars 30582 1726855376.73701: starting attempt loop 30582 1726855376.73703: running the handler 30582 1726855376.73720: _low_level_execute_command(): starting 30582 1726855376.73723: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855376.74239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855376.74243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.74246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855376.74249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.74292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.74307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.74383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.76085: stdout chunk (state=3): >>>/root <<< 30582 1726855376.76182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.76216: stderr chunk (state=3): >>><<< 30582 1726855376.76219: stdout chunk (state=3): >>><<< 30582 1726855376.76237: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.76249: _low_level_execute_command(): starting 30582 1726855376.76254: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206 `" && echo ansible-tmp-1726855376.7623785-35806-124941379421206="` echo /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206 `" ) && sleep 0' 30582 1726855376.76683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.76686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.76794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855376.76807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.77017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.78893: stdout chunk (state=3): >>>ansible-tmp-1726855376.7623785-35806-124941379421206=/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206 <<< 30582 1726855376.78999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.79024: stderr chunk (state=3): >>><<< 30582 1726855376.79027: stdout chunk (state=3): >>><<< 30582 1726855376.79044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855376.7623785-35806-124941379421206=/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.79084: variable 'ansible_module_compression' from source: unknown 30582 1726855376.79120: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855376.79193: variable 'ansible_facts' from source: unknown 30582 1726855376.79211: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py 30582 1726855376.79314: Sending initial data 30582 1726855376.79317: Sent initial data (162 bytes) 30582 1726855376.79744: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855376.79749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.79752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.79754: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.79805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.79808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.79870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.81422: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855376.81427: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855376.81477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855376.81536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp85lkjjdg /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py <<< 30582 1726855376.81540: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py" <<< 30582 1726855376.81593: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp85lkjjdg" to remote "/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py" <<< 30582 1726855376.81598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py" <<< 30582 1726855376.82205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.82246: stderr chunk (state=3): >>><<< 30582 1726855376.82249: stdout chunk (state=3): >>><<< 30582 1726855376.82276: done transferring module to remote 30582 1726855376.82284: _low_level_execute_command(): starting 30582 1726855376.82290: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/ /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py && sleep 0' 30582 1726855376.82722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.82725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855376.82728: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855376.82730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.82732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.82781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.82785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.82847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855376.84670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855376.84674: stdout chunk (state=3): >>><<< 30582 1726855376.84676: stderr chunk (state=3): >>><<< 30582 1726855376.84695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855376.84703: _low_level_execute_command(): starting 30582 1726855376.84711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/AnsiballZ_service_facts.py && sleep 0' 30582 1726855376.85298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855376.85302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.85308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855376.85325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855376.85370: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855376.85374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855376.85445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.36711: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855378.36730: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30582 1726855378.36755: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30582 1726855378.36761: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30582 1726855378.36791: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": 
{"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855378.38275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855378.38315: stderr chunk (state=3): >>><<< 30582 1726855378.38318: stdout chunk (state=3): >>><<< 30582 1726855378.38348: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855378.38816: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855378.38826: _low_level_execute_command(): starting 30582 1726855378.38831: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855376.7623785-35806-124941379421206/ > /dev/null 2>&1 && sleep 0' 30582 1726855378.39291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.39295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.39297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.39299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.39351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855378.39354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855378.39357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.39417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.41233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855378.41260: stderr chunk (state=3): >>><<< 30582 1726855378.41265: stdout chunk (state=3): >>><<< 30582 1726855378.41275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30582 1726855378.41281: handler run complete 30582 1726855378.41394: variable 'ansible_facts' from source: unknown 30582 1726855378.41494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855378.41771: variable 'ansible_facts' from source: unknown 30582 1726855378.41858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855378.41970: attempt loop complete, returning result 30582 1726855378.41976: _execute() done 30582 1726855378.41979: dumping result to json 30582 1726855378.42016: done dumping result, returning 30582 1726855378.42025: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000002384] 30582 1726855378.42030: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002384 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855378.42890: no more pending results, returning what we have 30582 1726855378.42893: results queue empty 30582 1726855378.42894: checking for any_errors_fatal 30582 1726855378.42898: done checking for any_errors_fatal 30582 1726855378.42898: checking for max_fail_percentage 30582 1726855378.42900: done checking for max_fail_percentage 30582 1726855378.42901: checking to see if all hosts have failed and the running result is not ok 30582 1726855378.42902: done checking to see if all hosts have failed 30582 1726855378.42903: getting the remaining hosts for this loop 30582 1726855378.42904: done getting the remaining hosts for this loop 30582 1726855378.42907: getting the next task for host managed_node3 30582 1726855378.42919: done getting next task for host managed_node3 30582 1726855378.42922: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 30582 1726855378.42927: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855378.42937: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002384 30582 1726855378.42940: WORKER PROCESS EXITING 30582 1726855378.42946: getting variables 30582 1726855378.42947: in VariableManager get_vars() 30582 1726855378.42974: Calling all_inventory to load vars for managed_node3 30582 1726855378.42976: Calling groups_inventory to load vars for managed_node3 30582 1726855378.42977: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855378.42984: Calling all_plugins_play to load vars for managed_node3 30582 1726855378.42986: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855378.42993: Calling groups_plugins_play to load vars for managed_node3 30582 1726855378.43726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855378.44602: done with get_vars() 30582 1726855378.44622: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:02:58 -0400 (0:00:01.725) 0:01:54.796 ****** 30582 1726855378.44700: entering _queue_task() for managed_node3/package_facts 30582 1726855378.44964: worker is 1 (out of 1 available) 30582 1726855378.44977: exiting _queue_task() for managed_node3/package_facts 30582 1726855378.44994: done queuing things up, now waiting for results queue to drain 30582 1726855378.44996: waiting for pending results... 
30582 1726855378.45188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855378.45292: in run() - task 0affcc66-ac2b-aa83-7d57-000000002385 30582 1726855378.45306: variable 'ansible_search_path' from source: unknown 30582 1726855378.45316: variable 'ansible_search_path' from source: unknown 30582 1726855378.45345: calling self._execute() 30582 1726855378.45420: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855378.45424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855378.45434: variable 'omit' from source: magic vars 30582 1726855378.45718: variable 'ansible_distribution_major_version' from source: facts 30582 1726855378.45727: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855378.45733: variable 'omit' from source: magic vars 30582 1726855378.45794: variable 'omit' from source: magic vars 30582 1726855378.45816: variable 'omit' from source: magic vars 30582 1726855378.45848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855378.45876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855378.45901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855378.45913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855378.45924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855378.45948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855378.45950: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855378.45953: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855378.46030: Set connection var ansible_timeout to 10 30582 1726855378.46033: Set connection var ansible_connection to ssh 30582 1726855378.46039: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855378.46043: Set connection var ansible_pipelining to False 30582 1726855378.46048: Set connection var ansible_shell_executable to /bin/sh 30582 1726855378.46050: Set connection var ansible_shell_type to sh 30582 1726855378.46069: variable 'ansible_shell_executable' from source: unknown 30582 1726855378.46072: variable 'ansible_connection' from source: unknown 30582 1726855378.46075: variable 'ansible_module_compression' from source: unknown 30582 1726855378.46077: variable 'ansible_shell_type' from source: unknown 30582 1726855378.46079: variable 'ansible_shell_executable' from source: unknown 30582 1726855378.46082: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855378.46086: variable 'ansible_pipelining' from source: unknown 30582 1726855378.46090: variable 'ansible_timeout' from source: unknown 30582 1726855378.46097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855378.46237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855378.46246: variable 'omit' from source: magic vars 30582 1726855378.46251: starting attempt loop 30582 1726855378.46254: running the handler 30582 1726855378.46268: _low_level_execute_command(): starting 30582 1726855378.46274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855378.46788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855378.46793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855378.46795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.46798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.46848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855378.46851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855378.46853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.46921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.48609: stdout chunk (state=3): >>>/root <<< 30582 1726855378.48721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855378.48734: stderr chunk (state=3): >>><<< 30582 1726855378.48737: stdout chunk (state=3): >>><<< 30582 1726855378.48758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855378.48771: _low_level_execute_command(): starting 30582 1726855378.48777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852 `" && echo ansible-tmp-1726855378.4875734-35845-33860724747852="` echo /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852 `" ) && sleep 0' 30582 1726855378.49223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855378.49226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855378.49228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.49238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.49240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.49286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855378.49293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.49354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.51250: stdout chunk (state=3): >>>ansible-tmp-1726855378.4875734-35845-33860724747852=/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852 <<< 30582 1726855378.51352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855378.51382: stderr chunk (state=3): >>><<< 30582 1726855378.51385: stdout chunk (state=3): >>><<< 30582 1726855378.51400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855378.4875734-35845-33860724747852=/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855378.51438: variable 'ansible_module_compression' from source: unknown 30582 1726855378.51477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855378.51530: variable 'ansible_facts' from source: unknown 30582 1726855378.51647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py 30582 1726855378.51751: Sending initial data 30582 1726855378.51755: Sent initial data (161 bytes) 30582 1726855378.52191: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.52194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.52197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855378.52200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.52250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855378.52256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855378.52264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.52312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.53880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30582 1726855378.53883: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855378.53935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855378.53997: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppfjizhmv /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py <<< 30582 1726855378.54001: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py" <<< 30582 1726855378.54068: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmppfjizhmv" to remote "/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py" <<< 30582 1726855378.54073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py" <<< 30582 1726855378.55198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855378.55232: stderr chunk (state=3): >>><<< 30582 1726855378.55236: stdout chunk (state=3): >>><<< 30582 1726855378.55272: done transferring module to remote 30582 1726855378.55280: _low_level_execute_command(): starting 30582 1726855378.55284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/ /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py && sleep 0' 30582 1726855378.55707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.55710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855378.55712: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855378.55714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.55720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.55757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855378.55775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.55837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855378.57621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855378.57645: stderr chunk (state=3): >>><<< 30582 1726855378.57649: stdout chunk (state=3): >>><<< 30582 1726855378.57662: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855378.57668: _low_level_execute_command(): starting 30582 1726855378.57671: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/AnsiballZ_package_facts.py && sleep 0' 30582 1726855378.58095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855378.58098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855378.58100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.58103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855378.58105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855378.58106: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855378.58149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855378.58153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855378.58221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.02222: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30582 1726855379.02258: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 30582 1726855379.02263: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 30582 1726855379.02302: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30582 1726855379.02314: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", 
"version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": 
"systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855379.02360: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": 
"2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30582 1726855379.02372: stdout chunk (state=3): >>>ue", "version": "102", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30582 1726855379.02381: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855379.02407: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", 
"version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", 
"version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": 
"20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855379.04210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855379.04239: stderr chunk (state=3): >>><<< 30582 1726855379.04243: stdout chunk (state=3): >>><<< 30582 1726855379.04283: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855379.05601: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855379.05619: _low_level_execute_command(): starting 30582 1726855379.05622: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855378.4875734-35845-33860724747852/ > /dev/null 2>&1 && sleep 0' 30582 1726855379.06081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.06084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.06086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855379.06091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.06094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.06142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855379.06145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855379.06147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.06215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.08090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855379.08117: stderr chunk (state=3): >>><<< 30582 1726855379.08120: stdout chunk (state=3): >>><<< 30582 1726855379.08134: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855379.08140: handler run complete 30582 1726855379.08598: variable 'ansible_facts' from source: unknown 30582 1726855379.08876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.09968: variable 'ansible_facts' from source: unknown 30582 1726855379.10208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.10582: attempt loop complete, returning result 30582 1726855379.10592: _execute() done 30582 1726855379.10595: dumping result to json 30582 1726855379.10709: done dumping result, returning 30582 1726855379.10718: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000002385] 30582 1726855379.10727: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002385 30582 1726855379.12013: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002385 30582 1726855379.12017: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855379.12111: no more pending results, returning what we have 30582 1726855379.12114: results queue empty 30582 1726855379.12114: checking for any_errors_fatal 30582 1726855379.12117: done checking for any_errors_fatal 30582 1726855379.12118: checking for max_fail_percentage 30582 1726855379.12119: done checking for max_fail_percentage 30582 1726855379.12120: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.12120: done checking to see if all hosts have failed 30582 1726855379.12121: getting the remaining hosts for this loop 30582 1726855379.12121: done getting the remaining hosts for this loop 30582 1726855379.12124: getting 
the next task for host managed_node3 30582 1726855379.12130: done getting next task for host managed_node3 30582 1726855379.12132: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855379.12137: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855379.12146: getting variables 30582 1726855379.12147: in VariableManager get_vars() 30582 1726855379.12174: Calling all_inventory to load vars for managed_node3 30582 1726855379.12176: Calling groups_inventory to load vars for managed_node3 30582 1726855379.12178: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.12185: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.12189: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.12191: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.12913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.13791: done with get_vars() 30582 1726855379.13811: done getting variables 30582 1726855379.13855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:59 -0400 (0:00:00.691) 0:01:55.488 ****** 30582 1726855379.13883: entering _queue_task() for managed_node3/debug 30582 1726855379.14125: worker is 1 (out of 1 available) 30582 1726855379.14141: exiting _queue_task() for managed_node3/debug 30582 1726855379.14153: done queuing things up, now waiting for results queue to drain 30582 1726855379.14155: waiting for pending results... 
30582 1726855379.14335: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855379.14421: in run() - task 0affcc66-ac2b-aa83-7d57-000000002329 30582 1726855379.14433: variable 'ansible_search_path' from source: unknown 30582 1726855379.14437: variable 'ansible_search_path' from source: unknown 30582 1726855379.14469: calling self._execute() 30582 1726855379.14540: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.14544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.14552: variable 'omit' from source: magic vars 30582 1726855379.14832: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.14842: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.14848: variable 'omit' from source: magic vars 30582 1726855379.14898: variable 'omit' from source: magic vars 30582 1726855379.14967: variable 'network_provider' from source: set_fact 30582 1726855379.14979: variable 'omit' from source: magic vars 30582 1726855379.15013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855379.15041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855379.15059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855379.15073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855379.15083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855379.15109: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855379.15112: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855379.15115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.15190: Set connection var ansible_timeout to 10 30582 1726855379.15193: Set connection var ansible_connection to ssh 30582 1726855379.15198: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855379.15204: Set connection var ansible_pipelining to False 30582 1726855379.15209: Set connection var ansible_shell_executable to /bin/sh 30582 1726855379.15211: Set connection var ansible_shell_type to sh 30582 1726855379.15228: variable 'ansible_shell_executable' from source: unknown 30582 1726855379.15231: variable 'ansible_connection' from source: unknown 30582 1726855379.15233: variable 'ansible_module_compression' from source: unknown 30582 1726855379.15236: variable 'ansible_shell_type' from source: unknown 30582 1726855379.15238: variable 'ansible_shell_executable' from source: unknown 30582 1726855379.15240: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.15242: variable 'ansible_pipelining' from source: unknown 30582 1726855379.15245: variable 'ansible_timeout' from source: unknown 30582 1726855379.15249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.15349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855379.15359: variable 'omit' from source: magic vars 30582 1726855379.15363: starting attempt loop 30582 1726855379.15368: running the handler 30582 1726855379.15404: handler run complete 30582 1726855379.15414: attempt loop complete, returning result 30582 1726855379.15417: _execute() done 30582 1726855379.15420: dumping result to json 30582 1726855379.15422: done dumping result, returning 
30582 1726855379.15429: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000002329] 30582 1726855379.15434: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002329 30582 1726855379.15518: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002329 30582 1726855379.15521: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855379.15589: no more pending results, returning what we have 30582 1726855379.15594: results queue empty 30582 1726855379.15595: checking for any_errors_fatal 30582 1726855379.15604: done checking for any_errors_fatal 30582 1726855379.15604: checking for max_fail_percentage 30582 1726855379.15606: done checking for max_fail_percentage 30582 1726855379.15607: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.15608: done checking to see if all hosts have failed 30582 1726855379.15608: getting the remaining hosts for this loop 30582 1726855379.15610: done getting the remaining hosts for this loop 30582 1726855379.15613: getting the next task for host managed_node3 30582 1726855379.15620: done getting next task for host managed_node3 30582 1726855379.15624: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855379.15629: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.15643: getting variables 30582 1726855379.15645: in VariableManager get_vars() 30582 1726855379.15684: Calling all_inventory to load vars for managed_node3 30582 1726855379.15691: Calling groups_inventory to load vars for managed_node3 30582 1726855379.15694: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.15702: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.15704: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.15706: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.16471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.17433: done with get_vars() 30582 1726855379.17449: done getting variables 30582 1726855379.17494: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:59 -0400 (0:00:00.036) 0:01:55.525 ****** 30582 1726855379.17523: entering _queue_task() for managed_node3/fail 30582 1726855379.17746: worker is 1 (out of 1 available) 30582 1726855379.17761: exiting _queue_task() for managed_node3/fail 30582 1726855379.17777: done queuing things up, now waiting for results queue to drain 30582 1726855379.17778: waiting for pending results... 30582 1726855379.17951: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855379.18046: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232a 30582 1726855379.18057: variable 'ansible_search_path' from source: unknown 30582 1726855379.18060: variable 'ansible_search_path' from source: unknown 30582 1726855379.18091: calling self._execute() 30582 1726855379.18160: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.18167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.18172: variable 'omit' from source: magic vars 30582 1726855379.18433: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.18443: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.18527: variable 'network_state' from source: role '' defaults 30582 1726855379.18536: Evaluated conditional (network_state != {}): False 30582 1726855379.18539: when evaluation is False, skipping this task 30582 1726855379.18541: _execute() done 30582 1726855379.18546: dumping result to json 30582 1726855379.18548: done dumping result, returning 30582 1726855379.18556: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-00000000232a] 30582 1726855379.18559: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232a 30582 1726855379.18646: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232a 30582 1726855379.18649: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855379.18706: no more pending results, returning what we have 30582 1726855379.18710: results queue empty 30582 1726855379.18711: checking for any_errors_fatal 30582 1726855379.18716: done checking for any_errors_fatal 30582 1726855379.18717: checking for max_fail_percentage 30582 1726855379.18719: done checking for max_fail_percentage 30582 1726855379.18719: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.18720: done checking to see if all hosts have failed 30582 1726855379.18721: getting the remaining hosts for this loop 30582 1726855379.18722: done getting the remaining hosts for this loop 30582 1726855379.18726: getting the next task for host managed_node3 30582 1726855379.18733: done getting next task for host managed_node3 30582 1726855379.18736: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855379.18741: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.18761: getting variables 30582 1726855379.18765: in VariableManager get_vars() 30582 1726855379.18802: Calling all_inventory to load vars for managed_node3 30582 1726855379.18804: Calling groups_inventory to load vars for managed_node3 30582 1726855379.18806: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.18815: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.18817: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.18819: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.24262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.25124: done with get_vars() 30582 1726855379.25145: done getting variables 30582 1726855379.25185: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:59 -0400 (0:00:00.076) 0:01:55.602 ****** 30582 1726855379.25210: entering _queue_task() for managed_node3/fail 30582 1726855379.25500: worker is 1 (out of 1 available) 30582 1726855379.25515: exiting _queue_task() for managed_node3/fail 30582 1726855379.25527: done queuing things up, now waiting for results queue to drain 30582 1726855379.25530: waiting for pending results... 30582 1726855379.25716: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855379.25822: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232b 30582 1726855379.25834: variable 'ansible_search_path' from source: unknown 30582 1726855379.25839: variable 'ansible_search_path' from source: unknown 30582 1726855379.25871: calling self._execute() 30582 1726855379.25953: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.25958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.25972: variable 'omit' from source: magic vars 30582 1726855379.26274: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.26283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.26378: variable 'network_state' from source: role '' defaults 30582 1726855379.26386: Evaluated conditional (network_state != {}): False 30582 1726855379.26391: when evaluation is False, skipping this task 30582 1726855379.26395: _execute() done 30582 1726855379.26399: dumping result to json 30582 1726855379.26402: done dumping result, returning 30582 1726855379.26406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-00000000232b] 30582 1726855379.26412: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232b 30582 1726855379.26508: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232b 30582 1726855379.26510: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855379.26568: no more pending results, returning what we have 30582 1726855379.26572: results queue empty 30582 1726855379.26573: checking for any_errors_fatal 30582 1726855379.26586: done checking for any_errors_fatal 30582 1726855379.26589: checking for max_fail_percentage 30582 1726855379.26591: done checking for max_fail_percentage 30582 1726855379.26592: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.26593: done checking to see if all hosts have failed 30582 1726855379.26593: getting the remaining hosts for this loop 30582 1726855379.26596: done getting the remaining hosts for this loop 30582 1726855379.26599: getting the next task for host managed_node3 30582 1726855379.26609: done getting next task for host managed_node3 30582 1726855379.26613: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855379.26619: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.26645: getting variables 30582 1726855379.26646: in VariableManager get_vars() 30582 1726855379.26696: Calling all_inventory to load vars for managed_node3 30582 1726855379.26698: Calling groups_inventory to load vars for managed_node3 30582 1726855379.26700: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.26712: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.26714: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.26717: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.27517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.28499: done with get_vars() 30582 1726855379.28514: done getting variables 30582 1726855379.28557: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:59 -0400 (0:00:00.033) 0:01:55.635 ****** 30582 1726855379.28585: entering _queue_task() for managed_node3/fail 30582 1726855379.28840: worker is 1 (out of 1 available) 30582 1726855379.28855: exiting _queue_task() for managed_node3/fail 30582 1726855379.28871: done queuing things up, now waiting for results queue to drain 30582 1726855379.28872: waiting for pending results... 30582 1726855379.29060: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855379.29167: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232c 30582 1726855379.29175: variable 'ansible_search_path' from source: unknown 30582 1726855379.29179: variable 'ansible_search_path' from source: unknown 30582 1726855379.29213: calling self._execute() 30582 1726855379.29283: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.29291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.29299: variable 'omit' from source: magic vars 30582 1726855379.29576: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.29585: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.29706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855379.31228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855379.31283: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855379.31314: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855379.31339: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855379.31358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855379.31421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.31442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.31459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.31491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.31500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.31573: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.31589: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855379.31668: variable 'ansible_distribution' from source: facts 30582 1726855379.31672: variable '__network_rh_distros' from source: role '' defaults 30582 1726855379.31678: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855379.31835: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.31852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.31869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.31897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.31907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.31942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.31958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.31976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.32002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855379.32012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.32043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.32058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.32076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.32101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.32111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.32302: variable 'network_connections' from source: include params 30582 1726855379.32310: variable 'interface' from source: play vars 30582 1726855379.32355: variable 'interface' from source: play vars 30582 1726855379.32367: variable 'network_state' from source: role '' defaults 30582 1726855379.32412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855379.32530: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855379.32558: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855379.32581: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855379.32609: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855379.32639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855379.32656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855379.32680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.32704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855379.32723: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855379.32726: when evaluation is False, skipping this task 30582 1726855379.32729: _execute() done 30582 1726855379.32731: dumping result to json 30582 1726855379.32733: done dumping result, returning 30582 1726855379.32741: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-00000000232c] 30582 1726855379.32746: sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000232c 30582 1726855379.32834: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232c 30582 1726855379.32838: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855379.32881: no more pending results, returning what we have 30582 1726855379.32885: results queue empty 30582 1726855379.32886: checking for any_errors_fatal 30582 1726855379.32894: done checking for any_errors_fatal 30582 1726855379.32894: checking for max_fail_percentage 30582 1726855379.32896: done checking for max_fail_percentage 30582 1726855379.32897: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.32898: done checking to see if all hosts have failed 30582 1726855379.32899: getting the remaining hosts for this loop 30582 1726855379.32900: done getting the remaining hosts for this loop 30582 1726855379.32904: getting the next task for host managed_node3 30582 1726855379.32911: done getting next task for host managed_node3 30582 1726855379.32915: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855379.32920: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.32944: getting variables 30582 1726855379.32946: in VariableManager get_vars() 30582 1726855379.32996: Calling all_inventory to load vars for managed_node3 30582 1726855379.33000: Calling groups_inventory to load vars for managed_node3 30582 1726855379.33002: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.33012: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.33014: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.33017: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.33872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.34750: done with get_vars() 30582 1726855379.34768: done getting variables 30582 1726855379.34812: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:59 -0400 (0:00:00.062) 0:01:55.698 ****** 30582 1726855379.34838: entering _queue_task() for managed_node3/dnf 30582 1726855379.35090: worker is 1 (out of 1 available) 30582 1726855379.35105: exiting _queue_task() for managed_node3/dnf 30582 1726855379.35117: done queuing things up, now waiting for results queue to drain 30582 1726855379.35119: waiting for pending results... 30582 1726855379.35308: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855379.35413: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232d 30582 1726855379.35424: variable 'ansible_search_path' from source: unknown 30582 1726855379.35427: variable 'ansible_search_path' from source: unknown 30582 1726855379.35457: calling self._execute() 30582 1726855379.35531: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.35538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.35546: variable 'omit' from source: magic vars 30582 1726855379.35826: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.35835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.35973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855379.37537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855379.37893: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855379.37921: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855379.37946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855379.37967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855379.38029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.38049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.38071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.38099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.38111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.38199: variable 'ansible_distribution' from source: facts 30582 1726855379.38202: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.38215: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855379.38295: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855379.38378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.38398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.38415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.38440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.38450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.38480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.38500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.38516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.38540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.38550: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.38579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.38598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.38616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.38639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.38649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.38752: variable 'network_connections' from source: include params 30582 1726855379.38761: variable 'interface' from source: play vars 30582 1726855379.38811: variable 'interface' from source: play vars 30582 1726855379.38862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855379.38975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855379.39003: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855379.39026: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855379.39049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855379.39096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855379.39113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855379.39134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.39153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855379.39192: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855379.39343: variable 'network_connections' from source: include params 30582 1726855379.39347: variable 'interface' from source: play vars 30582 1726855379.39397: variable 'interface' from source: play vars 30582 1726855379.39415: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855379.39419: when evaluation is False, skipping this task 30582 1726855379.39421: _execute() done 30582 1726855379.39423: dumping result to json 30582 1726855379.39426: done dumping result, returning 30582 1726855379.39434: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000232d] 30582 
1726855379.39439: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232d 30582 1726855379.39529: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232d 30582 1726855379.39532: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855379.39579: no more pending results, returning what we have 30582 1726855379.39583: results queue empty 30582 1726855379.39584: checking for any_errors_fatal 30582 1726855379.39592: done checking for any_errors_fatal 30582 1726855379.39593: checking for max_fail_percentage 30582 1726855379.39595: done checking for max_fail_percentage 30582 1726855379.39596: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.39597: done checking to see if all hosts have failed 30582 1726855379.39597: getting the remaining hosts for this loop 30582 1726855379.39599: done getting the remaining hosts for this loop 30582 1726855379.39603: getting the next task for host managed_node3 30582 1726855379.39612: done getting next task for host managed_node3 30582 1726855379.39615: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855379.39620: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.39644: getting variables 30582 1726855379.39645: in VariableManager get_vars() 30582 1726855379.39696: Calling all_inventory to load vars for managed_node3 30582 1726855379.39700: Calling groups_inventory to load vars for managed_node3 30582 1726855379.39702: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.39712: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.39715: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.39717: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.40740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.41599: done with get_vars() 30582 1726855379.41616: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855379.41671: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:59 -0400 (0:00:00.068) 0:01:55.766 ****** 30582 1726855379.41698: entering _queue_task() for managed_node3/yum 30582 1726855379.41962: worker is 1 (out of 1 available) 30582 1726855379.41978: exiting _queue_task() for managed_node3/yum 30582 1726855379.41990: done queuing things up, now waiting for results queue to drain 30582 1726855379.41992: waiting for pending results... 30582 1726855379.42178: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855379.42290: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232e 30582 1726855379.42301: variable 'ansible_search_path' from source: unknown 30582 1726855379.42305: variable 'ansible_search_path' from source: unknown 30582 1726855379.42335: calling self._execute() 30582 1726855379.42409: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.42413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.42421: variable 'omit' from source: magic vars 30582 1726855379.42712: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.42721: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.42843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855379.44430: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855379.44486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855379.44518: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855379.44545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855379.44566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855379.44631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.44650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.44671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.44700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.44716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.44789: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.44803: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855379.44806: when evaluation is False, skipping this task 30582 1726855379.44808: _execute() done 30582 1726855379.44811: dumping result to json 30582 1726855379.44813: done dumping result, returning 30582 1726855379.44825: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000232e] 30582 1726855379.44828: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232e 30582 1726855379.44919: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232e 30582 1726855379.44922: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855379.44976: no more pending results, returning what we have 30582 1726855379.44981: results queue empty 30582 1726855379.44982: checking for any_errors_fatal 30582 1726855379.44990: done checking for any_errors_fatal 30582 1726855379.44990: checking for max_fail_percentage 30582 1726855379.44992: done checking for max_fail_percentage 30582 1726855379.44993: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.44994: done checking to see if all hosts have failed 30582 1726855379.44994: getting the remaining hosts for this loop 30582 1726855379.44996: done getting the remaining hosts for this loop 30582 1726855379.45000: getting the next task for host managed_node3 30582 1726855379.45008: done getting next task for host managed_node3 30582 1726855379.45013: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855379.45019: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.45045: getting variables 30582 1726855379.45046: in VariableManager get_vars() 30582 1726855379.45097: Calling all_inventory to load vars for managed_node3 30582 1726855379.45100: Calling groups_inventory to load vars for managed_node3 30582 1726855379.45102: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.45112: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.45114: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.45116: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.45963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.46851: done with get_vars() 30582 1726855379.46871: done getting variables 30582 1726855379.46917: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:59 -0400 (0:00:00.052) 0:01:55.819 ****** 30582 1726855379.46944: entering _queue_task() for managed_node3/fail 30582 1726855379.47208: worker is 1 (out of 1 available) 30582 1726855379.47223: exiting _queue_task() for managed_node3/fail 30582 1726855379.47236: done queuing things up, now waiting for results queue to drain 30582 1726855379.47237: waiting for pending results... 30582 1726855379.47435: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855379.47543: in run() - task 0affcc66-ac2b-aa83-7d57-00000000232f 30582 1726855379.47554: variable 'ansible_search_path' from source: unknown 30582 1726855379.47557: variable 'ansible_search_path' from source: unknown 30582 1726855379.47591: calling self._execute() 30582 1726855379.47670: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.47676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.47679: variable 'omit' from source: magic vars 30582 1726855379.47959: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.47967: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.48069: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855379.48198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855379.49979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855379.50026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855379.50085: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855379.50098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855379.50119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855379.50176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.50201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.50219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.50244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.50255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.50291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.50309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.50325: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.50350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.50360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.50390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.50409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.50425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.50448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.50459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.50574: variable 'network_connections' from source: include params 30582 1726855379.50585: variable 'interface' from source: play vars 30582 1726855379.50639: variable 'interface' from source: play vars 30582 1726855379.50691: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855379.50798: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855379.50827: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855379.50853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855379.50884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855379.50915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855379.50930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855379.50952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.50969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855379.51008: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855379.51171: variable 'network_connections' from source: include params 30582 1726855379.51174: variable 'interface' from source: play vars 30582 1726855379.51215: variable 'interface' from source: play vars 30582 1726855379.51233: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855379.51237: when evaluation is False, skipping this task 30582 
1726855379.51240: _execute() done 30582 1726855379.51242: dumping result to json 30582 1726855379.51244: done dumping result, returning 30582 1726855379.51252: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000232f] 30582 1726855379.51257: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232f 30582 1726855379.51349: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000232f 30582 1726855379.51351: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855379.51430: no more pending results, returning what we have 30582 1726855379.51433: results queue empty 30582 1726855379.51434: checking for any_errors_fatal 30582 1726855379.51441: done checking for any_errors_fatal 30582 1726855379.51441: checking for max_fail_percentage 30582 1726855379.51443: done checking for max_fail_percentage 30582 1726855379.51444: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.51445: done checking to see if all hosts have failed 30582 1726855379.51445: getting the remaining hosts for this loop 30582 1726855379.51447: done getting the remaining hosts for this loop 30582 1726855379.51451: getting the next task for host managed_node3 30582 1726855379.51459: done getting next task for host managed_node3 30582 1726855379.51465: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855379.51470: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855379.51495: getting variables 30582 1726855379.51497: in VariableManager get_vars() 30582 1726855379.51539: Calling all_inventory to load vars for managed_node3 30582 1726855379.51541: Calling groups_inventory to load vars for managed_node3 30582 1726855379.51543: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.51552: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.51555: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.51557: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.52519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.53393: done with get_vars() 30582 1726855379.53409: done getting variables 30582 1726855379.53454: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:59 -0400 (0:00:00.065) 0:01:55.884 ****** 30582 1726855379.53484: entering _queue_task() for managed_node3/package 30582 1726855379.53752: worker is 1 (out of 1 available) 30582 1726855379.53769: exiting _queue_task() for managed_node3/package 30582 1726855379.53782: done queuing things up, now waiting for results queue to drain 30582 1726855379.53784: waiting for pending results... 30582 1726855379.53974: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855379.54081: in run() - task 0affcc66-ac2b-aa83-7d57-000000002330 30582 1726855379.54095: variable 'ansible_search_path' from source: unknown 30582 1726855379.54099: variable 'ansible_search_path' from source: unknown 30582 1726855379.54129: calling self._execute() 30582 1726855379.54201: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855379.54205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855379.54214: variable 'omit' from source: magic vars 30582 1726855379.54500: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.54508: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855379.54647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855379.54848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855379.54885: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855379.54914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855379.54972: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855379.55059: variable 'network_packages' from source: role '' defaults 30582 1726855379.55137: variable '__network_provider_setup' from source: role '' defaults 30582 1726855379.55146: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855379.55193: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855379.55201: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855379.55245: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855379.55360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855379.56738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855379.56782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855379.56811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855379.56837: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855379.56857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855379.56925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.56947: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.56964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.56994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.57005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.57035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.57055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.57074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.57100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.57110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855379.57264: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855379.57335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.57351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.57369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.57401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.57412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.57475: variable 'ansible_python' from source: facts 30582 1726855379.57493: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855379.57548: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855379.57606: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855379.57685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.57704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.57722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.57746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.57756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.57790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855379.57813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855379.57827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.57852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855379.57863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855379.57958: variable 'network_connections' from source: include params 
30582 1726855379.57962: variable 'interface' from source: play vars 30582 1726855379.58034: variable 'interface' from source: play vars 30582 1726855379.58083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855379.58104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855379.58124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855379.58148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855379.58184: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855379.58366: variable 'network_connections' from source: include params 30582 1726855379.58369: variable 'interface' from source: play vars 30582 1726855379.58439: variable 'interface' from source: play vars 30582 1726855379.58469: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855379.58517: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855379.58714: variable 'network_connections' from source: include params 30582 1726855379.58717: variable 'interface' from source: play vars 30582 1726855379.58762: variable 'interface' from source: play vars 30582 1726855379.58779: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855379.58838: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855379.59030: variable 'network_connections' 
from source: include params 30582 1726855379.59033: variable 'interface' from source: play vars 30582 1726855379.59079: variable 'interface' from source: play vars 30582 1726855379.59121: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855379.59158: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855379.59166: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855379.59207: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855379.59341: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855379.59632: variable 'network_connections' from source: include params 30582 1726855379.59636: variable 'interface' from source: play vars 30582 1726855379.59679: variable 'interface' from source: play vars 30582 1726855379.59686: variable 'ansible_distribution' from source: facts 30582 1726855379.59690: variable '__network_rh_distros' from source: role '' defaults 30582 1726855379.59696: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.59707: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855379.59814: variable 'ansible_distribution' from source: facts 30582 1726855379.59818: variable '__network_rh_distros' from source: role '' defaults 30582 1726855379.59822: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.59833: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855379.59942: variable 'ansible_distribution' from source: facts 30582 1726855379.59945: variable '__network_rh_distros' from source: role '' defaults 30582 1726855379.59950: variable 'ansible_distribution_major_version' from source: facts 30582 1726855379.59976: variable 'network_provider' from source: set_fact 30582 
1726855379.59989: variable 'ansible_facts' from source: unknown 30582 1726855379.60582: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855379.60585: when evaluation is False, skipping this task 30582 1726855379.60590: _execute() done 30582 1726855379.60592: dumping result to json 30582 1726855379.60594: done dumping result, returning 30582 1726855379.60601: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-000000002330] 30582 1726855379.60605: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002330 30582 1726855379.60698: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002330 30582 1726855379.60701: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855379.60750: no more pending results, returning what we have 30582 1726855379.60753: results queue empty 30582 1726855379.60754: checking for any_errors_fatal 30582 1726855379.60762: done checking for any_errors_fatal 30582 1726855379.60763: checking for max_fail_percentage 30582 1726855379.60765: done checking for max_fail_percentage 30582 1726855379.60766: checking to see if all hosts have failed and the running result is not ok 30582 1726855379.60767: done checking to see if all hosts have failed 30582 1726855379.60767: getting the remaining hosts for this loop 30582 1726855379.60769: done getting the remaining hosts for this loop 30582 1726855379.60773: getting the next task for host managed_node3 30582 1726855379.60780: done getting next task for host managed_node3 30582 1726855379.60784: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855379.60791: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855379.60815: getting variables 30582 1726855379.60817: in VariableManager get_vars() 30582 1726855379.60866: Calling all_inventory to load vars for managed_node3 30582 1726855379.60869: Calling groups_inventory to load vars for managed_node3 30582 1726855379.60871: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855379.60881: Calling all_plugins_play to load vars for managed_node3 30582 1726855379.60883: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855379.60886: Calling groups_plugins_play to load vars for managed_node3 30582 1726855379.61721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855379.62729: done with get_vars() 30582 1726855379.62746: done getting variables 30582 1726855379.62791: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:59 -0400 (0:00:00.093) 0:01:55.978 ****** 30582 1726855379.62817: entering _queue_task() for managed_node3/package 30582 1726855379.63071: worker is 1 (out of 1 available) 30582 1726855379.63085: exiting _queue_task() for managed_node3/package 30582 1726855379.63098: done queuing things up, now waiting for results queue to drain 30582 1726855379.63100: waiting for pending results... 
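The "Ask user's consent to restart NetworkManager" task skipped earlier in this trace loads the `fail` action plugin and is guarded by the conditional quoted in its skip result. A minimal sketch of what such a guard task looks like (the `when` expression is copied verbatim from the log; the task body and message text are assumptions, since the log does not show the YAML itself):

```yaml
# Hypothetical reconstruction of tasks/main.yml:60 in
# fedora.linux_system_roles.network -- the actual body may differ.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: Restarting NetworkManager is required for wireless or team interfaces  # wording is an assumption
  # Conditional taken verbatim from the skip result in the log:
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

In this run both role defaults evaluated false for the play's `network_connections`, so the conditional was False and the task was skipped rather than failing the host.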
30582 1726855379.63285: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30582 1726855379.63390: in run() - task 0affcc66-ac2b-aa83-7d57-000000002331
30582 1726855379.63401: variable 'ansible_search_path' from source: unknown
30582 1726855379.63404: variable 'ansible_search_path' from source: unknown
30582 1726855379.63437: calling self._execute()
30582 1726855379.63511: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.63515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.63523: variable 'omit' from source: magic vars
30582 1726855379.63807: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.63817: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855379.63904: variable 'network_state' from source: role '' defaults
30582 1726855379.63913: Evaluated conditional (network_state != {}): False
30582 1726855379.63916: when evaluation is False, skipping this task
30582 1726855379.63918: _execute() done
30582 1726855379.63921: dumping result to json
30582 1726855379.63923: done dumping result, returning
30582 1726855379.63931: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000002331]
30582 1726855379.63937: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002331
30582 1726855379.64030: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002331
30582 1726855379.64033: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30582 1726855379.64080: no more pending results, returning what we have
30582 1726855379.64084: results queue empty
30582 1726855379.64085: checking for any_errors_fatal
30582 1726855379.64093: done checking for any_errors_fatal
30582 1726855379.64093: checking for max_fail_percentage
30582 1726855379.64096: done checking for max_fail_percentage
30582 1726855379.64097: checking to see if all hosts have failed and the running result is not ok
30582 1726855379.64098: done checking to see if all hosts have failed
30582 1726855379.64099: getting the remaining hosts for this loop
30582 1726855379.64100: done getting the remaining hosts for this loop
30582 1726855379.64103: getting the next task for host managed_node3
30582 1726855379.64112: done getting next task for host managed_node3
30582 1726855379.64116: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30582 1726855379.64121: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855379.64147: getting variables
30582 1726855379.64149: in VariableManager get_vars()
30582 1726855379.64189: Calling all_inventory to load vars for managed_node3
30582 1726855379.64191: Calling groups_inventory to load vars for managed_node3
30582 1726855379.64194: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855379.64204: Calling all_plugins_play to load vars for managed_node3
30582 1726855379.64207: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855379.64209: Calling groups_plugins_play to load vars for managed_node3
30582 1726855379.65000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855379.65880: done with get_vars()
30582 1726855379.65901: done getting variables
30582 1726855379.65945: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  14:02:59 -0400 (0:00:00.031)       0:01:56.009 ******
30582 1726855379.65971: entering _queue_task() for managed_node3/package
30582 1726855379.66225: worker is 1 (out of 1 available)
30582 1726855379.66238: exiting _queue_task() for managed_node3/package
30582 1726855379.66250: done queuing things up, now waiting for results queue to drain
30582 1726855379.66253: waiting for pending results...
30582 1726855379.66443: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30582 1726855379.66538: in run() - task 0affcc66-ac2b-aa83-7d57-000000002332
30582 1726855379.66549: variable 'ansible_search_path' from source: unknown
30582 1726855379.66554: variable 'ansible_search_path' from source: unknown
30582 1726855379.66586: calling self._execute()
30582 1726855379.66661: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.66666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.66675: variable 'omit' from source: magic vars
30582 1726855379.66960: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.66972: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855379.67055: variable 'network_state' from source: role '' defaults
30582 1726855379.67064: Evaluated conditional (network_state != {}): False
30582 1726855379.67070: when evaluation is False, skipping this task
30582 1726855379.67072: _execute() done
30582 1726855379.67075: dumping result to json
30582 1726855379.67078: done dumping result, returning
30582 1726855379.67086: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-000000002332]
30582 1726855379.67093: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002332
30582 1726855379.67183: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002332
30582 1726855379.67186: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30582 1726855379.67234: no more pending results, returning what we have
30582 1726855379.67238: results queue empty
30582 1726855379.67239: checking for any_errors_fatal
30582 1726855379.67246: done checking for any_errors_fatal
30582 1726855379.67247: checking for max_fail_percentage
30582 1726855379.67249: done checking for max_fail_percentage
30582 1726855379.67250: checking to see if all hosts have failed and the running result is not ok
30582 1726855379.67250: done checking to see if all hosts have failed
30582 1726855379.67251: getting the remaining hosts for this loop
30582 1726855379.67253: done getting the remaining hosts for this loop
30582 1726855379.67256: getting the next task for host managed_node3
30582 1726855379.67264: done getting next task for host managed_node3
30582 1726855379.67268: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30582 1726855379.67273: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855379.67301: getting variables
30582 1726855379.67303: in VariableManager get_vars()
30582 1726855379.67342: Calling all_inventory to load vars for managed_node3
30582 1726855379.67345: Calling groups_inventory to load vars for managed_node3
30582 1726855379.67347: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855379.67356: Calling all_plugins_play to load vars for managed_node3
30582 1726855379.67359: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855379.67362: Calling groups_plugins_play to load vars for managed_node3
30582 1726855379.68311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855379.69171: done with get_vars()
30582 1726855379.69190: done getting variables
30582 1726855379.69233: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  14:02:59 -0400 (0:00:00.032)       0:01:56.042 ******
30582 1726855379.69259: entering _queue_task() for managed_node3/service
30582 1726855379.69512: worker is 1 (out of 1 available)
30582 1726855379.69526: exiting _queue_task() for managed_node3/service
30582 1726855379.69538: done queuing things up, now waiting for results queue to drain
30582 1726855379.69539: waiting for pending results...
30582 1726855379.69731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30582 1726855379.69839: in run() - task 0affcc66-ac2b-aa83-7d57-000000002333
30582 1726855379.69849: variable 'ansible_search_path' from source: unknown
30582 1726855379.69854: variable 'ansible_search_path' from source: unknown
30582 1726855379.69885: calling self._execute()
30582 1726855379.69956: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.69960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.69971: variable 'omit' from source: magic vars
30582 1726855379.70247: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.70257: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855379.70350: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855379.70485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855379.72040: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855379.72102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855379.72129: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855379.72156: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855379.72178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855379.72237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.72261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.72283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.72311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.72322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.72354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.72377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.72395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.72419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.72430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.72457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.72476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.72497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.72521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.72531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.72658: variable 'network_connections' from source: include params
30582 1726855379.72671: variable 'interface' from source: play vars
30582 1726855379.72726: variable 'interface' from source: play vars
30582 1726855379.72779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855379.72892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855379.72929: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855379.72952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855379.72976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855379.73009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855379.73026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855379.73046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.73064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855379.73106: variable '__network_team_connections_defined' from source: role '' defaults
30582 1726855379.73263: variable 'network_connections' from source: include params
30582 1726855379.73270: variable 'interface' from source: play vars
30582 1726855379.73315: variable 'interface' from source: play vars
30582 1726855379.73334: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30582 1726855379.73338: when evaluation is False, skipping this task
30582 1726855379.73340: _execute() done
30582 1726855379.73343: dumping result to json
30582 1726855379.73345: done dumping result, returning
30582 1726855379.73355: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000002333]
30582 1726855379.73358: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002333
30582 1726855379.73457: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002333
30582 1726855379.73468: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30582 1726855379.73515: no more pending results, returning what we have
30582 1726855379.73518: results queue empty
30582 1726855379.73519: checking for any_errors_fatal
30582 1726855379.73529: done checking for any_errors_fatal
30582 1726855379.73529: checking for max_fail_percentage
30582 1726855379.73531: done checking for max_fail_percentage
30582 1726855379.73532: checking to see if all hosts have failed and the running result is not ok
30582 1726855379.73533: done checking to see if all hosts have failed
30582 1726855379.73533: getting the remaining hosts for this loop
30582 1726855379.73535: done getting the remaining hosts for this loop
30582 1726855379.73538: getting the next task for host managed_node3
30582 1726855379.73547: done getting next task for host managed_node3
30582 1726855379.73551: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30582 1726855379.73556: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855379.73580: getting variables
30582 1726855379.73582: in VariableManager get_vars()
30582 1726855379.73634: Calling all_inventory to load vars for managed_node3
30582 1726855379.73637: Calling groups_inventory to load vars for managed_node3
30582 1726855379.73639: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855379.73648: Calling all_plugins_play to load vars for managed_node3
30582 1726855379.73650: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855379.73653: Calling groups_plugins_play to load vars for managed_node3
30582 1726855379.74503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855379.75384: done with get_vars()
30582 1726855379.75405: done getting variables
30582 1726855379.75452: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  14:02:59 -0400 (0:00:00.062)       0:01:56.104 ******
30582 1726855379.75478: entering _queue_task() for managed_node3/service
30582 1726855379.75744: worker is 1 (out of 1 available)
30582 1726855379.75756: exiting _queue_task() for managed_node3/service
30582 1726855379.75767: done queuing things up, now waiting for results queue to drain
30582 1726855379.75769: waiting for pending results...
30582 1726855379.75969: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30582 1726855379.76072: in run() - task 0affcc66-ac2b-aa83-7d57-000000002334
30582 1726855379.76082: variable 'ansible_search_path' from source: unknown
30582 1726855379.76085: variable 'ansible_search_path' from source: unknown
30582 1726855379.76118: calling self._execute()
30582 1726855379.76196: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.76200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.76213: variable 'omit' from source: magic vars
30582 1726855379.76491: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.76501: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855379.76616: variable 'network_provider' from source: set_fact
30582 1726855379.76620: variable 'network_state' from source: role '' defaults
30582 1726855379.76629: Evaluated conditional (network_provider == "nm" or network_state != {}): True
30582 1726855379.76635: variable 'omit' from source: magic vars
30582 1726855379.76677: variable 'omit' from source: magic vars
30582 1726855379.76697: variable 'network_service_name' from source: role '' defaults
30582 1726855379.76745: variable 'network_service_name' from source: role '' defaults
30582 1726855379.76818: variable '__network_provider_setup' from source: role '' defaults
30582 1726855379.76821: variable '__network_service_name_default_nm' from source: role '' defaults
30582 1726855379.76869: variable '__network_service_name_default_nm' from source: role '' defaults
30582 1726855379.76875: variable '__network_packages_default_nm' from source: role '' defaults
30582 1726855379.76920: variable '__network_packages_default_nm' from source: role '' defaults
30582 1726855379.77067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855379.78820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855379.78869: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855379.78907: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855379.78935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855379.78955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855379.79015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.79038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.79055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.79082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.79094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.79125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.79144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.79160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.79185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.79198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.79340: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30582 1726855379.79423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.79439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.79457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.79485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.79497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.79557: variable 'ansible_python' from source: facts
30582 1726855379.79575: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30582 1726855379.79629: variable '__network_wpa_supplicant_required' from source: role '' defaults
30582 1726855379.79683: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30582 1726855379.79769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.79788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.79893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.79896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.79898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.79914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855379.79950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855379.79982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.80027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855379.80047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855379.80186: variable 'network_connections' from source: include params
30582 1726855379.80202: variable 'interface' from source: play vars
30582 1726855379.80283: variable 'interface' from source: play vars
30582 1726855379.80398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855379.80592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855379.80665: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855379.80893: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855379.80896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855379.80899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855379.80901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855379.80909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855379.80948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855379.81015: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855379.81311: variable 'network_connections' from source: include params
30582 1726855379.81322: variable 'interface' from source: play vars
30582 1726855379.81398: variable 'interface' from source: play vars
30582 1726855379.81435: variable '__network_packages_default_wireless' from source: role '' defaults
30582 1726855379.81515: variable '__network_wireless_connections_defined' from source: role '' defaults
30582 1726855379.81702: variable 'network_connections' from source: include params
30582 1726855379.81705: variable 'interface' from source: play vars
30582 1726855379.81753: variable 'interface' from source: play vars
30582 1726855379.81772: variable '__network_packages_default_team' from source: role '' defaults
30582 1726855379.81829: variable '__network_team_connections_defined' from source: role '' defaults
30582 1726855379.82016: variable 'network_connections' from source: include params
30582 1726855379.82020: variable 'interface' from source: play vars
30582 1726855379.82066: variable 'interface' from source: play vars
30582 1726855379.82105: variable '__network_service_name_default_initscripts' from source: role '' defaults
30582 1726855379.82149: variable '__network_service_name_default_initscripts' from source: role '' defaults
30582 1726855379.82155: variable '__network_packages_default_initscripts' from source: role '' defaults
30582 1726855379.82199: variable '__network_packages_default_initscripts' from source: role '' defaults
30582 1726855379.82338: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
30582 1726855379.82642: variable 'network_connections' from source: include params
30582 1726855379.82645: variable 'interface' from source: play vars
30582 1726855379.82693: variable 'interface' from source: play vars
30582 1726855379.82700: variable 'ansible_distribution' from source: facts
30582 1726855379.82702: variable '__network_rh_distros' from source: role '' defaults
30582 1726855379.82709: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.82720: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
30582 1726855379.82832: variable 'ansible_distribution' from source: facts
30582 1726855379.82835: variable '__network_rh_distros' from source: role '' defaults
30582 1726855379.82840: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.82851: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
30582 1726855379.82963: variable 'ansible_distribution' from source: facts
30582 1726855379.82969: variable '__network_rh_distros' from source: role '' defaults
30582 1726855379.82974: variable 'ansible_distribution_major_version' from source: facts
30582 1726855379.83005: variable 'network_provider' from source: set_fact
30582 1726855379.83022: variable 'omit' from source: magic vars
30582 1726855379.83043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30582 1726855379.83065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30582 1726855379.83081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30582 1726855379.83097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855379.83106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30582 1726855379.83128: variable 'inventory_hostname' from source: host vars for 'managed_node3'
30582 1726855379.83131: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.83133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.83204: Set connection var ansible_timeout to 10
30582 1726855379.83207: Set connection var ansible_connection to ssh
30582 1726855379.83214: Set connection var ansible_module_compression to ZIP_DEFLATED
30582 1726855379.83219: Set connection var ansible_pipelining to False
30582 1726855379.83223: Set connection var ansible_shell_executable to /bin/sh
30582 1726855379.83226: Set connection var ansible_shell_type to sh
30582 1726855379.83244: variable 'ansible_shell_executable' from source: unknown
30582 1726855379.83246: variable 'ansible_connection' from source: unknown
30582 1726855379.83249: variable 'ansible_module_compression' from source: unknown
30582 1726855379.83251: variable 'ansible_shell_type' from source: unknown
30582 1726855379.83253: variable 'ansible_shell_executable' from source: unknown
30582 1726855379.83255: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855379.83260: variable 'ansible_pipelining' from source: unknown
30582 1726855379.83262: variable 'ansible_timeout' from source: unknown
30582 1726855379.83268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855379.83343: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30582 1726855379.83351: variable 'omit' from source: magic vars
30582 1726855379.83357: starting attempt loop
30582 1726855379.83360: running the handler
30582 1726855379.83421: variable 'ansible_facts' from source: unknown
30582 1726855379.83841: _low_level_execute_command(): starting
30582 1726855379.83845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30582 1726855379.84343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855379.84347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855379.84350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30582 1726855379.84352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30582 1726855379.84392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<<
30582 1726855379.84405: stderr
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.84484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.86194: stdout chunk (state=3): >>>/root <<< 30582 1726855379.86300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855379.86327: stderr chunk (state=3): >>><<< 30582 1726855379.86337: stdout chunk (state=3): >>><<< 30582 1726855379.86356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855379.86367: _low_level_execute_command(): starting 30582 1726855379.86375: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540 `" && echo ansible-tmp-1726855379.8635578-35867-125266484372540="` echo /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540 `" ) && sleep 0' 30582 1726855379.86816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855379.86819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855379.86822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855379.86824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.86826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.86875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855379.86878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.86945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.88886: stdout chunk (state=3): >>>ansible-tmp-1726855379.8635578-35867-125266484372540=/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540 <<< 30582 1726855379.88992: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855379.89019: stderr chunk (state=3): >>><<< 30582 1726855379.89022: stdout chunk (state=3): >>><<< 30582 1726855379.89036: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855379.8635578-35867-125266484372540=/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855379.89065: variable 'ansible_module_compression' from source: unknown 30582 1726855379.89107: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855379.89157: variable 'ansible_facts' from source: unknown 30582 1726855379.89292: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py 30582 1726855379.89394: Sending initial data 30582 1726855379.89398: Sent initial data (156 bytes) 30582 1726855379.89841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.89844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855379.89851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.89854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.89856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.89903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855379.89906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.89976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.91571: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855379.91578: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855379.91625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855379.91692: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprcwsbt0_ /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py <<< 30582 1726855379.91696: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py" <<< 30582 1726855379.91746: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprcwsbt0_" to remote "/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py" <<< 30582 1726855379.91752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py" <<< 30582 1726855379.92874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855379.92920: stderr chunk (state=3): >>><<< 30582 1726855379.92923: stdout chunk (state=3): >>><<< 30582 1726855379.92960: done transferring module to remote 30582 1726855379.92971: _low_level_execute_command(): starting 30582 1726855379.92975: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/ /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py && sleep 0' 30582 1726855379.93409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855379.93413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855379.93415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.93417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.93419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.93476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855379.93482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855379.93483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.93539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855379.95361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855379.95388: stderr chunk (state=3): >>><<< 30582 1726855379.95392: 
stdout chunk (state=3): >>><<< 30582 1726855379.95404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855379.95407: _low_level_execute_command(): starting 30582 1726855379.95411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/AnsiballZ_systemd.py && sleep 0' 30582 1726855379.95826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855379.95830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.95840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855379.95893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855379.95910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855379.95971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.25315: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not 
set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314638848", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2310125000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855380.25323: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855380.27180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855380.27210: stderr chunk (state=3): >>><<< 30582 1726855380.27214: stdout chunk (state=3): >>><<< 30582 1726855380.27228: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314638848", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2310125000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855380.27354: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855380.27374: _low_level_execute_command(): starting 30582 1726855380.27377: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855379.8635578-35867-125266484372540/ > /dev/null 2>&1 && sleep 0' 30582 1726855380.27827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855380.27830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855380.27833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.27835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855380.27837: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.27893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.27896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.27903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.27960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.29780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855380.29808: stderr chunk (state=3): >>><<< 30582 1726855380.29811: stdout chunk (state=3): >>><<< 30582 1726855380.29822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855380.29831: handler run complete 30582 1726855380.29870: attempt loop complete, returning result 30582 1726855380.29873: _execute() done 30582 1726855380.29875: dumping result to json 30582 1726855380.29889: done dumping result, returning 30582 1726855380.29898: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-000000002334] 30582 1726855380.29903: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002334 30582 1726855380.30141: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002334 30582 1726855380.30144: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855380.30207: no more pending results, returning what we have 30582 1726855380.30211: results queue empty 30582 1726855380.30212: checking for any_errors_fatal 30582 1726855380.30218: done checking for any_errors_fatal 30582 1726855380.30219: checking for max_fail_percentage 30582 1726855380.30220: done checking for max_fail_percentage 30582 1726855380.30221: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.30222: done checking to see if all hosts have failed 30582 1726855380.30223: getting the remaining hosts for this loop 30582 1726855380.30224: done getting the remaining hosts for this loop 30582 1726855380.30228: getting the next task for host managed_node3 30582 1726855380.30235: done getting next task for host managed_node3 30582 1726855380.30238: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855380.30243: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855380.30256: getting variables 30582 1726855380.30258: in VariableManager get_vars() 30582 1726855380.30298: Calling all_inventory to load vars for managed_node3 30582 1726855380.30301: Calling groups_inventory to load vars for managed_node3 30582 1726855380.30303: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.30313: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.30316: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.30319: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.31299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855380.32164: done with get_vars() 30582 1726855380.32180: done getting variables 30582 1726855380.32225: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:03:00 -0400 (0:00:00.567) 0:01:56.672 ****** 30582 1726855380.32255: entering _queue_task() for managed_node3/service 30582 1726855380.32498: worker is 1 (out of 1 available) 30582 1726855380.32511: exiting _queue_task() for managed_node3/service 30582 1726855380.32523: done queuing things up, now waiting for results queue to drain 30582 1726855380.32526: waiting for pending results... 
30582 1726855380.32711: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855380.32819: in run() - task 0affcc66-ac2b-aa83-7d57-000000002335 30582 1726855380.32830: variable 'ansible_search_path' from source: unknown 30582 1726855380.32833: variable 'ansible_search_path' from source: unknown 30582 1726855380.32862: calling self._execute() 30582 1726855380.32933: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.32936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.32945: variable 'omit' from source: magic vars 30582 1726855380.33223: variable 'ansible_distribution_major_version' from source: facts 30582 1726855380.33232: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855380.33316: variable 'network_provider' from source: set_fact 30582 1726855380.33319: Evaluated conditional (network_provider == "nm"): True 30582 1726855380.33386: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855380.33453: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855380.33571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855380.35036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855380.35086: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855380.35115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855380.35141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855380.35162: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855380.35234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855380.35257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855380.35277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855380.35304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855380.35315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855380.35347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855380.35364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855380.35384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855380.35410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855380.35420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855380.35448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855380.35465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855380.35485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855380.35510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855380.35521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855380.35626: variable 'network_connections' from source: include params 30582 1726855380.35636: variable 'interface' from source: play vars 30582 1726855380.35688: variable 'interface' from source: play vars 30582 1726855380.35738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855380.35848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855380.35878: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855380.35903: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855380.35926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855380.35955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855380.35973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855380.35991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855380.36009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855380.36047: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855380.36206: variable 'network_connections' from source: include params 30582 1726855380.36210: variable 'interface' from source: play vars 30582 1726855380.36256: variable 'interface' from source: play vars 30582 1726855380.36281: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855380.36284: when evaluation is False, skipping this task 30582 1726855380.36289: _execute() done 30582 1726855380.36291: dumping result to json 30582 1726855380.36293: done dumping result, returning 30582 1726855380.36301: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-000000002335] 30582 
1726855380.36313: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002335 30582 1726855380.36398: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002335 30582 1726855380.36401: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855380.36447: no more pending results, returning what we have 30582 1726855380.36450: results queue empty 30582 1726855380.36451: checking for any_errors_fatal 30582 1726855380.36478: done checking for any_errors_fatal 30582 1726855380.36479: checking for max_fail_percentage 30582 1726855380.36481: done checking for max_fail_percentage 30582 1726855380.36482: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.36482: done checking to see if all hosts have failed 30582 1726855380.36483: getting the remaining hosts for this loop 30582 1726855380.36485: done getting the remaining hosts for this loop 30582 1726855380.36490: getting the next task for host managed_node3 30582 1726855380.36500: done getting next task for host managed_node3 30582 1726855380.36504: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855380.36509: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855380.36534: getting variables 30582 1726855380.36535: in VariableManager get_vars() 30582 1726855380.36580: Calling all_inventory to load vars for managed_node3 30582 1726855380.36583: Calling groups_inventory to load vars for managed_node3 30582 1726855380.36585: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.36600: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.36603: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.36605: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.37428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855380.38431: done with get_vars() 30582 1726855380.38448: done getting variables 30582 1726855380.38493: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:03:00 -0400 (0:00:00.062) 0:01:56.735 
****** 30582 1726855380.38517: entering _queue_task() for managed_node3/service 30582 1726855380.38767: worker is 1 (out of 1 available) 30582 1726855380.38780: exiting _queue_task() for managed_node3/service 30582 1726855380.38794: done queuing things up, now waiting for results queue to drain 30582 1726855380.38796: waiting for pending results... 30582 1726855380.38980: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855380.39074: in run() - task 0affcc66-ac2b-aa83-7d57-000000002336 30582 1726855380.39088: variable 'ansible_search_path' from source: unknown 30582 1726855380.39092: variable 'ansible_search_path' from source: unknown 30582 1726855380.39120: calling self._execute() 30582 1726855380.39202: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.39206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.39214: variable 'omit' from source: magic vars 30582 1726855380.39501: variable 'ansible_distribution_major_version' from source: facts 30582 1726855380.39510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855380.39594: variable 'network_provider' from source: set_fact 30582 1726855380.39599: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855380.39602: when evaluation is False, skipping this task 30582 1726855380.39604: _execute() done 30582 1726855380.39607: dumping result to json 30582 1726855380.39611: done dumping result, returning 30582 1726855380.39618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-000000002336] 30582 1726855380.39623: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002336 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
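
The skip above comes from the task's `when:` guard: the log records `Evaluated conditional (network_provider == "initscripts"): False`, so the task never reaches the service module. A hedged sketch of the guarded task shape (the service name `network` is an assumption for illustration; the authoritative text is at `roles/network/tasks/main.yml:142` in the collection):

```yaml
# Hedged sketch only -- the conditional is taken from the log;
# the module arguments are illustrative.
- name: Enable network service
  ansible.builtin.service:
    name: network          # hypothetical: legacy initscripts network service
    state: started
    enabled: true
  when: network_provider == "initscripts"   # False here, so the task is skipped
```

Because `network_provider` was set to `"nm"` earlier in the run, every initscripts-only task in this block is expected to skip with `"skip_reason": "Conditional result was False"`, as the subsequent records confirm.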
30582 1726855380.39758: no more pending results, returning what we have 30582 1726855380.39762: results queue empty 30582 1726855380.39763: checking for any_errors_fatal 30582 1726855380.39773: done checking for any_errors_fatal 30582 1726855380.39773: checking for max_fail_percentage 30582 1726855380.39775: done checking for max_fail_percentage 30582 1726855380.39777: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.39777: done checking to see if all hosts have failed 30582 1726855380.39778: getting the remaining hosts for this loop 30582 1726855380.39780: done getting the remaining hosts for this loop 30582 1726855380.39784: getting the next task for host managed_node3 30582 1726855380.39794: done getting next task for host managed_node3 30582 1726855380.39799: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855380.39805: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855380.39830: getting variables 30582 1726855380.39832: in VariableManager get_vars() 30582 1726855380.39872: Calling all_inventory to load vars for managed_node3 30582 1726855380.39875: Calling groups_inventory to load vars for managed_node3 30582 1726855380.39877: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.39892: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.39895: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.39900: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002336 30582 1726855380.39903: WORKER PROCESS EXITING 30582 1726855380.39906: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.40684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855380.41555: done with get_vars() 30582 1726855380.41574: done getting variables 30582 1726855380.41620: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:03:00 -0400 (0:00:00.031) 0:01:56.766 ****** 30582 1726855380.41648: entering _queue_task() for managed_node3/copy 30582 1726855380.41912: worker is 1 (out of 1 available) 30582 1726855380.41926: exiting _queue_task() for managed_node3/copy 30582 1726855380.41938: done queuing things up, now waiting for results queue to drain 30582 1726855380.41940: waiting for pending 
results... 30582 1726855380.42134: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855380.42237: in run() - task 0affcc66-ac2b-aa83-7d57-000000002337 30582 1726855380.42248: variable 'ansible_search_path' from source: unknown 30582 1726855380.42253: variable 'ansible_search_path' from source: unknown 30582 1726855380.42285: calling self._execute() 30582 1726855380.42360: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.42363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.42375: variable 'omit' from source: magic vars 30582 1726855380.42650: variable 'ansible_distribution_major_version' from source: facts 30582 1726855380.42660: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855380.42744: variable 'network_provider' from source: set_fact 30582 1726855380.42748: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855380.42751: when evaluation is False, skipping this task 30582 1726855380.42754: _execute() done 30582 1726855380.42756: dumping result to json 30582 1726855380.42758: done dumping result, returning 30582 1726855380.42771: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-000000002337] 30582 1726855380.42774: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002337 30582 1726855380.42861: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002337 30582 1726855380.42865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855380.42913: no more pending results, returning what we have 30582 1726855380.42917: results queue empty 30582 1726855380.42918: 
checking for any_errors_fatal 30582 1726855380.42928: done checking for any_errors_fatal 30582 1726855380.42929: checking for max_fail_percentage 30582 1726855380.42930: done checking for max_fail_percentage 30582 1726855380.42932: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.42933: done checking to see if all hosts have failed 30582 1726855380.42933: getting the remaining hosts for this loop 30582 1726855380.42935: done getting the remaining hosts for this loop 30582 1726855380.42938: getting the next task for host managed_node3 30582 1726855380.42947: done getting next task for host managed_node3 30582 1726855380.42951: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855380.42956: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855380.42983: getting variables 30582 1726855380.42984: in VariableManager get_vars() 30582 1726855380.43028: Calling all_inventory to load vars for managed_node3 30582 1726855380.43031: Calling groups_inventory to load vars for managed_node3 30582 1726855380.43033: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.43043: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.43046: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.43048: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.44005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855380.44864: done with get_vars() 30582 1726855380.44883: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:03:00 -0400 (0:00:00.032) 0:01:56.799 ****** 30582 1726855380.44948: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855380.45207: worker is 1 (out of 1 available) 30582 1726855380.45222: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855380.45234: done queuing things up, now waiting for results queue to drain 30582 1726855380.45236: waiting for pending results... 
30582 1726855380.45425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855380.45518: in run() - task 0affcc66-ac2b-aa83-7d57-000000002338 30582 1726855380.45530: variable 'ansible_search_path' from source: unknown 30582 1726855380.45533: variable 'ansible_search_path' from source: unknown 30582 1726855380.45561: calling self._execute() 30582 1726855380.45645: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.45649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.45658: variable 'omit' from source: magic vars 30582 1726855380.45942: variable 'ansible_distribution_major_version' from source: facts 30582 1726855380.45952: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855380.45958: variable 'omit' from source: magic vars 30582 1726855380.46005: variable 'omit' from source: magic vars 30582 1726855380.46123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855380.47598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855380.47649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855380.47679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855380.47706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855380.47728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855380.47791: variable 'network_provider' from source: set_fact 30582 1726855380.47889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855380.47908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855380.47925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855380.47951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855380.47969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855380.48022: variable 'omit' from source: magic vars 30582 1726855380.48102: variable 'omit' from source: magic vars 30582 1726855380.48172: variable 'network_connections' from source: include params 30582 1726855380.48184: variable 'interface' from source: play vars 30582 1726855380.48230: variable 'interface' from source: play vars 30582 1726855380.48336: variable 'omit' from source: magic vars 30582 1726855380.48343: variable '__lsr_ansible_managed' from source: task vars 30582 1726855380.48391: variable '__lsr_ansible_managed' from source: task vars 30582 1726855380.48527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855380.48669: Loaded config def from plugin (lookup/template) 30582 1726855380.48674: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855380.48694: File lookup term: get_ansible_managed.j2 30582 1726855380.48697: variable 
'ansible_search_path' from source: unknown 30582 1726855380.48700: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855380.48712: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855380.48727: variable 'ansible_search_path' from source: unknown 30582 1726855380.52168: variable 'ansible_managed' from source: unknown 30582 1726855380.52249: variable 'omit' from source: magic vars 30582 1726855380.52272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855380.52296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855380.52310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855380.52323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855380.52331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855380.52351: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855380.52354: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.52357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.52425: Set connection var ansible_timeout to 10 30582 1726855380.52428: Set connection var ansible_connection to ssh 30582 1726855380.52434: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855380.52438: Set connection var ansible_pipelining to False 30582 1726855380.52443: Set connection var ansible_shell_executable to /bin/sh 30582 1726855380.52445: Set connection var ansible_shell_type to sh 30582 1726855380.52465: variable 'ansible_shell_executable' from source: unknown 30582 1726855380.52468: variable 'ansible_connection' from source: unknown 30582 1726855380.52471: variable 'ansible_module_compression' from source: unknown 30582 1726855380.52473: variable 'ansible_shell_type' from source: unknown 30582 1726855380.52475: variable 'ansible_shell_executable' from source: unknown 30582 1726855380.52477: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.52479: variable 'ansible_pipelining' from source: unknown 30582 1726855380.52481: variable 'ansible_timeout' from source: unknown 30582 1726855380.52483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.52574: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855380.52586: variable 'omit' from 
source: magic vars 30582 1726855380.52591: starting attempt loop 30582 1726855380.52594: running the handler 30582 1726855380.52607: _low_level_execute_command(): starting 30582 1726855380.52615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855380.53107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855380.53111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855380.53114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855380.53116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.53165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.53170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.53172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.53246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.54999: stdout chunk (state=3): >>>/root <<< 30582 1726855380.55099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855380.55129: stderr chunk (state=3): >>><<< 30582 1726855380.55133: stdout chunk (state=3): >>><<< 30582 1726855380.55151: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855380.55162: _low_level_execute_command(): starting 30582 1726855380.55170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873 `" && echo ansible-tmp-1726855380.551518-35881-126170924666873="` echo /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873 `" ) && sleep 0' 30582 1726855380.55612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855380.55616: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855380.55618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.55620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855380.55622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.55677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.55684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.55685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.55745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.57747: stdout chunk (state=3): >>>ansible-tmp-1726855380.551518-35881-126170924666873=/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873 <<< 30582 1726855380.57846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855380.57876: stderr chunk (state=3): >>><<< 30582 1726855380.57879: stdout chunk (state=3): >>><<< 30582 1726855380.57899: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855380.551518-35881-126170924666873=/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855380.57939: variable 'ansible_module_compression' from source: unknown 30582 1726855380.57977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855380.58021: variable 'ansible_facts' from source: unknown 30582 1726855380.58115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py 30582 1726855380.58212: Sending initial data 30582 1726855380.58215: Sent initial data (167 bytes) 30582 1726855380.58671: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855380.58674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855380.58681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.58684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855380.58686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.58733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.58736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.58742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.58805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.60495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855380.60552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855380.60615: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp8gogg5kh /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py <<< 30582 1726855380.60623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py" <<< 30582 1726855380.60672: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp8gogg5kh" to remote "/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py" <<< 30582 1726855380.60679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py" <<< 30582 1726855380.61452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855380.61497: stderr chunk (state=3): >>><<< 30582 1726855380.61500: stdout chunk (state=3): >>><<< 30582 1726855380.61544: done transferring module to remote 30582 1726855380.61554: _low_level_execute_command(): starting 30582 1726855380.61558: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/ /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py && sleep 0' 
30582 1726855380.61993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855380.61997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855380.62005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.62018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.62070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.62073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.62079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.62138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.64028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855380.64052: stderr chunk (state=3): >>><<< 30582 1726855380.64055: stdout chunk (state=3): >>><<< 30582 1726855380.64097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855380.64100: _low_level_execute_command(): starting 30582 1726855380.64103: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/AnsiballZ_network_connections.py && sleep 0' 30582 1726855380.64531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855380.64534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.64537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855380.64539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.64591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.64600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.64673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.90362: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855380.92130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855380.92161: stderr chunk (state=3): >>><<< 30582 1726855380.92165: stdout chunk (state=3): >>><<< 30582 1726855380.92186: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855380.92216: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855380.92224: _low_level_execute_command(): starting 30582 1726855380.92229: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855380.551518-35881-126170924666873/ > /dev/null 2>&1 && sleep 0' 30582 1726855380.92684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855380.92690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855380.92692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.92696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855380.92700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855380.92745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855380.92749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855380.92751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855380.92817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855380.94644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855380.94674: stderr chunk (state=3): >>><<< 30582 1726855380.94676: stdout chunk (state=3): >>><<< 30582 1726855380.94691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855380.94695: handler run complete 30582 1726855380.94715: attempt loop complete, returning result 30582 1726855380.94717: _execute() done 30582 1726855380.94720: dumping result to json 30582 1726855380.94724: done dumping result, returning 30582 1726855380.94734: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-000000002338] 30582 1726855380.94736: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002338 30582 1726855380.94837: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002338 30582 1726855380.94846: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active 30582 1726855380.94944: no more pending results, returning what we have 30582 1726855380.94947: results queue empty 30582 1726855380.94948: checking for any_errors_fatal 30582 1726855380.94954: done checking for any_errors_fatal 30582 1726855380.94956: checking for max_fail_percentage 30582 1726855380.94958: done checking for max_fail_percentage 30582 1726855380.94959: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.94960: done checking to see if all hosts have 
failed 30582 1726855380.94960: getting the remaining hosts for this loop 30582 1726855380.94962: done getting the remaining hosts for this loop 30582 1726855380.94967: getting the next task for host managed_node3 30582 1726855380.94975: done getting next task for host managed_node3 30582 1726855380.94978: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855380.94983: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855380.94998: getting variables 30582 1726855380.94999: in VariableManager get_vars() 30582 1726855380.95040: Calling all_inventory to load vars for managed_node3 30582 1726855380.95043: Calling groups_inventory to load vars for managed_node3 30582 1726855380.95045: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.95055: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.95058: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.95060: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.95934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855380.96810: done with get_vars() 30582 1726855380.96829: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:03:00 -0400 (0:00:00.519) 0:01:57.318 ****** 30582 1726855380.96896: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855380.97139: worker is 1 (out of 1 available) 30582 1726855380.97152: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855380.97168: done queuing things up, now waiting for results queue to drain 30582 1726855380.97170: waiting for pending results... 
30582 1726855380.97350: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855380.97461: in run() - task 0affcc66-ac2b-aa83-7d57-000000002339 30582 1726855380.97474: variable 'ansible_search_path' from source: unknown 30582 1726855380.97478: variable 'ansible_search_path' from source: unknown 30582 1726855380.97511: calling self._execute() 30582 1726855380.97580: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855380.97583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855380.97593: variable 'omit' from source: magic vars 30582 1726855380.97870: variable 'ansible_distribution_major_version' from source: facts 30582 1726855380.97877: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855380.97961: variable 'network_state' from source: role '' defaults 30582 1726855380.97970: Evaluated conditional (network_state != {}): False 30582 1726855380.97974: when evaluation is False, skipping this task 30582 1726855380.97977: _execute() done 30582 1726855380.97980: dumping result to json 30582 1726855380.97982: done dumping result, returning 30582 1726855380.97991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-000000002339] 30582 1726855380.97996: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002339 30582 1726855380.98086: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002339 30582 1726855380.98091: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855380.98142: no more pending results, returning what we have 30582 1726855380.98146: results queue empty 30582 1726855380.98147: checking for any_errors_fatal 30582 1726855380.98157: done checking for any_errors_fatal 
30582 1726855380.98158: checking for max_fail_percentage 30582 1726855380.98160: done checking for max_fail_percentage 30582 1726855380.98161: checking to see if all hosts have failed and the running result is not ok 30582 1726855380.98162: done checking to see if all hosts have failed 30582 1726855380.98162: getting the remaining hosts for this loop 30582 1726855380.98166: done getting the remaining hosts for this loop 30582 1726855380.98170: getting the next task for host managed_node3 30582 1726855380.98177: done getting next task for host managed_node3 30582 1726855380.98181: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855380.98186: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855380.98212: getting variables 30582 1726855380.98214: in VariableManager get_vars() 30582 1726855380.98252: Calling all_inventory to load vars for managed_node3 30582 1726855380.98255: Calling groups_inventory to load vars for managed_node3 30582 1726855380.98257: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855380.98268: Calling all_plugins_play to load vars for managed_node3 30582 1726855380.98270: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855380.98273: Calling groups_plugins_play to load vars for managed_node3 30582 1726855380.99204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.00075: done with get_vars() 30582 1726855381.00094: done getting variables 30582 1726855381.00140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:03:01 -0400 (0:00:00.032) 0:01:57.351 ****** 30582 1726855381.00167: entering _queue_task() for managed_node3/debug 30582 1726855381.00421: worker is 1 (out of 1 available) 30582 1726855381.00434: exiting _queue_task() for managed_node3/debug 30582 1726855381.00445: done queuing things up, now waiting for results queue to drain 30582 1726855381.00447: waiting for pending results... 
30582 1726855381.00630: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855381.00732: in run() - task 0affcc66-ac2b-aa83-7d57-00000000233a 30582 1726855381.00744: variable 'ansible_search_path' from source: unknown 30582 1726855381.00747: variable 'ansible_search_path' from source: unknown 30582 1726855381.00778: calling self._execute() 30582 1726855381.00853: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.00856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.00868: variable 'omit' from source: magic vars 30582 1726855381.01134: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.01143: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.01149: variable 'omit' from source: magic vars 30582 1726855381.01195: variable 'omit' from source: magic vars 30582 1726855381.01222: variable 'omit' from source: magic vars 30582 1726855381.01250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855381.01276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855381.01295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855381.01308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.01319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.01344: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855381.01348: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.01350: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855381.01421: Set connection var ansible_timeout to 10 30582 1726855381.01424: Set connection var ansible_connection to ssh 30582 1726855381.01430: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855381.01439: Set connection var ansible_pipelining to False 30582 1726855381.01442: Set connection var ansible_shell_executable to /bin/sh 30582 1726855381.01444: Set connection var ansible_shell_type to sh 30582 1726855381.01461: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.01466: variable 'ansible_connection' from source: unknown 30582 1726855381.01469: variable 'ansible_module_compression' from source: unknown 30582 1726855381.01471: variable 'ansible_shell_type' from source: unknown 30582 1726855381.01474: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.01476: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.01478: variable 'ansible_pipelining' from source: unknown 30582 1726855381.01480: variable 'ansible_timeout' from source: unknown 30582 1726855381.01482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.01581: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855381.01590: variable 'omit' from source: magic vars 30582 1726855381.01597: starting attempt loop 30582 1726855381.01599: running the handler 30582 1726855381.01691: variable '__network_connections_result' from source: set_fact 30582 1726855381.01730: handler run complete 30582 1726855381.01743: attempt loop complete, returning result 30582 1726855381.01745: _execute() done 30582 1726855381.01748: dumping result to json 30582 1726855381.01750: 
done dumping result, returning 30582 1726855381.01760: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000233a] 30582 1726855381.01766: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233a 30582 1726855381.01847: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233a 30582 1726855381.01849: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active" ] } 30582 1726855381.01939: no more pending results, returning what we have 30582 1726855381.01943: results queue empty 30582 1726855381.01944: checking for any_errors_fatal 30582 1726855381.01949: done checking for any_errors_fatal 30582 1726855381.01950: checking for max_fail_percentage 30582 1726855381.01951: done checking for max_fail_percentage 30582 1726855381.01952: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.01953: done checking to see if all hosts have failed 30582 1726855381.01953: getting the remaining hosts for this loop 30582 1726855381.01955: done getting the remaining hosts for this loop 30582 1726855381.01958: getting the next task for host managed_node3 30582 1726855381.01967: done getting next task for host managed_node3 30582 1726855381.01970: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855381.01974: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855381.01986: getting variables 30582 1726855381.01989: in VariableManager get_vars() 30582 1726855381.02023: Calling all_inventory to load vars for managed_node3 30582 1726855381.02026: Calling groups_inventory to load vars for managed_node3 30582 1726855381.02028: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.02036: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.02038: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.02040: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.02803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.03776: done with get_vars() 30582 1726855381.03794: done getting variables 30582 1726855381.03839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:03:01 -0400 (0:00:00.037) 0:01:57.388 ****** 30582 1726855381.03870: entering _queue_task() for managed_node3/debug 30582 1726855381.04115: worker is 1 (out of 1 available) 30582 1726855381.04128: exiting _queue_task() for managed_node3/debug 30582 1726855381.04140: done queuing things up, now waiting for results queue to drain 30582 1726855381.04142: waiting for pending results... 30582 1726855381.04324: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855381.04414: in run() - task 0affcc66-ac2b-aa83-7d57-00000000233b 30582 1726855381.04427: variable 'ansible_search_path' from source: unknown 30582 1726855381.04431: variable 'ansible_search_path' from source: unknown 30582 1726855381.04459: calling self._execute() 30582 1726855381.04534: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.04538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.04546: variable 'omit' from source: magic vars 30582 1726855381.04820: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.04830: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.04836: variable 'omit' from source: magic vars 30582 1726855381.04886: variable 'omit' from source: magic vars 30582 1726855381.04914: variable 'omit' from source: magic vars 30582 1726855381.04947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855381.04974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855381.04991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855381.05005: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.05019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.05042: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855381.05045: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.05048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.05123: Set connection var ansible_timeout to 10 30582 1726855381.05126: Set connection var ansible_connection to ssh 30582 1726855381.05129: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855381.05138: Set connection var ansible_pipelining to False 30582 1726855381.05141: Set connection var ansible_shell_executable to /bin/sh 30582 1726855381.05143: Set connection var ansible_shell_type to sh 30582 1726855381.05159: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.05162: variable 'ansible_connection' from source: unknown 30582 1726855381.05167: variable 'ansible_module_compression' from source: unknown 30582 1726855381.05170: variable 'ansible_shell_type' from source: unknown 30582 1726855381.05172: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.05174: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.05176: variable 'ansible_pipelining' from source: unknown 30582 1726855381.05178: variable 'ansible_timeout' from source: unknown 30582 1726855381.05180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.05281: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855381.05292: variable 'omit' from source: magic vars 30582 1726855381.05297: starting attempt loop 30582 1726855381.05300: running the handler 30582 1726855381.05337: variable '__network_connections_result' from source: set_fact 30582 1726855381.05398: variable '__network_connections_result' from source: set_fact 30582 1726855381.05475: handler run complete 30582 1726855381.05494: attempt loop complete, returning result 30582 1726855381.05497: _execute() done 30582 1726855381.05499: dumping result to json 30582 1726855381.05502: done dumping result, returning 30582 1726855381.05511: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-00000000233b] 30582 1726855381.05515: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233b 30582 1726855381.05608: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233b 30582 1726855381.05611: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 02f79b0a-2569-4459-9e63-b8baa27c9d76 skipped because already active" ] } } 30582 1726855381.05707: no more pending results, returning what we have 30582 1726855381.05711: results queue empty 30582 
1726855381.05712: checking for any_errors_fatal 30582 1726855381.05721: done checking for any_errors_fatal 30582 1726855381.05722: checking for max_fail_percentage 30582 1726855381.05724: done checking for max_fail_percentage 30582 1726855381.05725: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.05725: done checking to see if all hosts have failed 30582 1726855381.05726: getting the remaining hosts for this loop 30582 1726855381.05728: done getting the remaining hosts for this loop 30582 1726855381.05731: getting the next task for host managed_node3 30582 1726855381.05737: done getting next task for host managed_node3 30582 1726855381.05741: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855381.05745: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.05757: getting variables 30582 1726855381.05758: in VariableManager get_vars() 30582 1726855381.05803: Calling all_inventory to load vars for managed_node3 30582 1726855381.05805: Calling groups_inventory to load vars for managed_node3 30582 1726855381.05812: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.05820: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.05823: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.05825: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.06600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.07460: done with get_vars() 30582 1726855381.07481: done getting variables 30582 1726855381.07528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:03:01 -0400 (0:00:00.036) 0:01:57.425 ****** 30582 1726855381.07554: entering _queue_task() for managed_node3/debug 30582 1726855381.07813: worker is 1 (out of 1 available) 30582 1726855381.07827: exiting _queue_task() for managed_node3/debug 30582 1726855381.07838: done queuing things up, now waiting for results queue to drain 30582 1726855381.07839: waiting for pending results... 
30582 1726855381.08033: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855381.08130: in run() - task 0affcc66-ac2b-aa83-7d57-00000000233c 30582 1726855381.08145: variable 'ansible_search_path' from source: unknown 30582 1726855381.08149: variable 'ansible_search_path' from source: unknown 30582 1726855381.08182: calling self._execute() 30582 1726855381.08266: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.08273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.08286: variable 'omit' from source: magic vars 30582 1726855381.08566: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.08578: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.08665: variable 'network_state' from source: role '' defaults 30582 1726855381.08677: Evaluated conditional (network_state != {}): False 30582 1726855381.08680: when evaluation is False, skipping this task 30582 1726855381.08683: _execute() done 30582 1726855381.08685: dumping result to json 30582 1726855381.08690: done dumping result, returning 30582 1726855381.08697: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-00000000233c] 30582 1726855381.08702: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233c 30582 1726855381.08790: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233c 30582 1726855381.08792: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855381.08867: no more pending results, returning what we have 30582 1726855381.08870: results queue empty 30582 1726855381.08871: checking for any_errors_fatal 30582 1726855381.08879: done checking for any_errors_fatal 30582 1726855381.08880: checking for 
max_fail_percentage 30582 1726855381.08882: done checking for max_fail_percentage 30582 1726855381.08883: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.08884: done checking to see if all hosts have failed 30582 1726855381.08884: getting the remaining hosts for this loop 30582 1726855381.08886: done getting the remaining hosts for this loop 30582 1726855381.08891: getting the next task for host managed_node3 30582 1726855381.08899: done getting next task for host managed_node3 30582 1726855381.08903: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855381.08907: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.08929: getting variables 30582 1726855381.08930: in VariableManager get_vars() 30582 1726855381.08968: Calling all_inventory to load vars for managed_node3 30582 1726855381.08971: Calling groups_inventory to load vars for managed_node3 30582 1726855381.08973: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.08981: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.08984: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.08986: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.09902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.15193: done with get_vars() 30582 1726855381.15217: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:03:01 -0400 (0:00:00.077) 0:01:57.502 ****** 30582 1726855381.15278: entering _queue_task() for managed_node3/ping 30582 1726855381.15556: worker is 1 (out of 1 available) 30582 1726855381.15570: exiting _queue_task() for managed_node3/ping 30582 1726855381.15582: done queuing things up, now waiting for results queue to drain 30582 1726855381.15584: waiting for pending results... 
30582 1726855381.15781: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855381.15901: in run() - task 0affcc66-ac2b-aa83-7d57-00000000233d 30582 1726855381.15911: variable 'ansible_search_path' from source: unknown 30582 1726855381.15917: variable 'ansible_search_path' from source: unknown 30582 1726855381.15949: calling self._execute() 30582 1726855381.16024: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.16030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.16039: variable 'omit' from source: magic vars 30582 1726855381.16333: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.16343: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.16349: variable 'omit' from source: magic vars 30582 1726855381.16398: variable 'omit' from source: magic vars 30582 1726855381.16422: variable 'omit' from source: magic vars 30582 1726855381.16454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855381.16488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855381.16506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855381.16520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.16530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.16555: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855381.16558: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.16560: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855381.16636: Set connection var ansible_timeout to 10 30582 1726855381.16639: Set connection var ansible_connection to ssh 30582 1726855381.16643: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855381.16649: Set connection var ansible_pipelining to False 30582 1726855381.16654: Set connection var ansible_shell_executable to /bin/sh 30582 1726855381.16657: Set connection var ansible_shell_type to sh 30582 1726855381.16676: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.16680: variable 'ansible_connection' from source: unknown 30582 1726855381.16684: variable 'ansible_module_compression' from source: unknown 30582 1726855381.16688: variable 'ansible_shell_type' from source: unknown 30582 1726855381.16692: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.16695: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.16698: variable 'ansible_pipelining' from source: unknown 30582 1726855381.16700: variable 'ansible_timeout' from source: unknown 30582 1726855381.16703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.16847: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855381.16856: variable 'omit' from source: magic vars 30582 1726855381.16861: starting attempt loop 30582 1726855381.16863: running the handler 30582 1726855381.16879: _low_level_execute_command(): starting 30582 1726855381.16885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855381.17379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855381.17409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855381.17414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.17471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.17474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.17479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.17544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.19250: stdout chunk (state=3): >>>/root <<< 30582 1726855381.19352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.19384: stderr chunk (state=3): >>><<< 30582 1726855381.19390: stdout chunk (state=3): >>><<< 30582 1726855381.19414: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.19425: _low_level_execute_command(): starting 30582 1726855381.19432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676 `" && echo ansible-tmp-1726855381.1941214-35896-90487244727676="` echo /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676 `" ) && sleep 0' 30582 1726855381.19871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855381.19875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.19878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 30582 1726855381.19890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855381.19893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.19932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.19936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.20002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.21886: stdout chunk (state=3): >>>ansible-tmp-1726855381.1941214-35896-90487244727676=/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676 <<< 30582 1726855381.21996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.22021: stderr chunk (state=3): >>><<< 30582 1726855381.22024: stdout chunk (state=3): >>><<< 30582 1726855381.22044: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855381.1941214-35896-90487244727676=/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.22083: variable 'ansible_module_compression' from source: unknown 30582 1726855381.22118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855381.22148: variable 'ansible_facts' from source: unknown 30582 1726855381.22205: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py 30582 1726855381.22306: Sending initial data 30582 1726855381.22309: Sent initial data (152 bytes) 30582 1726855381.22748: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.22752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855381.22754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.22756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30582 1726855381.22758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.22812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.22819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.22876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.24432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855381.24436: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855381.24488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855381.24549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9d7srf0r /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py <<< 30582 1726855381.24552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py" <<< 30582 1726855381.24605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp9d7srf0r" to remote "/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py" <<< 30582 1726855381.24612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py" <<< 30582 1726855381.25190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.25234: stderr chunk (state=3): >>><<< 30582 1726855381.25237: stdout chunk (state=3): >>><<< 30582 1726855381.25285: done transferring module to remote 30582 1726855381.25293: _low_level_execute_command(): starting 30582 1726855381.25298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/ /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py && sleep 0' 30582 1726855381.25747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.25750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.25753: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855381.25755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855381.25757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.25805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.25808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.25878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.27628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.27654: stderr chunk (state=3): >>><<< 30582 1726855381.27657: stdout chunk (state=3): >>><<< 30582 1726855381.27677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.27682: _low_level_execute_command(): starting 30582 1726855381.27684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/AnsiballZ_ping.py && sleep 0' 30582 1726855381.28137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.28140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.28143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855381.28145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855381.28147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.28199: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.28206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.28208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.28270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.43261: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855381.44553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855381.44583: stderr chunk (state=3): >>><<< 30582 1726855381.44586: stdout chunk (state=3): >>><<< 30582 1726855381.44606: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855381.44631: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855381.44639: _low_level_execute_command(): starting 30582 1726855381.44644: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855381.1941214-35896-90487244727676/ > /dev/null 2>&1 && sleep 0' 30582 1726855381.45130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.45134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.45136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.45141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855381.45143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.45193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.45196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.45205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.45268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.47118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.47145: stderr chunk (state=3): >>><<< 30582 1726855381.47148: stdout chunk (state=3): >>><<< 30582 1726855381.47164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.47172: handler run complete 30582 1726855381.47190: attempt loop complete, returning result 30582 1726855381.47193: _execute() done 30582 1726855381.47196: dumping result to json 30582 1726855381.47200: done dumping result, returning 30582 1726855381.47208: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-00000000233d] 30582 1726855381.47213: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233d 30582 1726855381.47304: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000233d 30582 1726855381.47307: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855381.47375: no more pending results, returning what we have 30582 1726855381.47378: results queue empty 30582 1726855381.47379: checking for any_errors_fatal 30582 1726855381.47390: done checking for any_errors_fatal 30582 1726855381.47390: checking for max_fail_percentage 30582 1726855381.47392: done checking for max_fail_percentage 30582 1726855381.47393: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.47394: done checking to see if all hosts have failed 30582 1726855381.47395: getting the remaining hosts for this loop 30582 1726855381.47396: done getting the remaining hosts for this loop 30582 1726855381.47400: getting the next task for host managed_node3 30582 1726855381.47411: done getting next task for host managed_node3 30582 1726855381.47413: ^ task is: TASK: meta (role_complete) 30582 1726855381.47418: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855381.47434: getting variables 30582 1726855381.47436: in VariableManager get_vars() 30582 1726855381.47484: Calling all_inventory to load vars for managed_node3 30582 1726855381.47488: Calling groups_inventory to load vars for managed_node3 30582 1726855381.47494: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.47504: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.47507: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.47509: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.48356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.49237: done with get_vars() 30582 1726855381.49255: done getting variables 30582 1726855381.49317: done queuing things up, now waiting for results queue to drain 30582 1726855381.49319: results queue empty 30582 1726855381.49319: checking for any_errors_fatal 30582 1726855381.49321: done checking for 
any_errors_fatal 30582 1726855381.49321: checking for max_fail_percentage 30582 1726855381.49322: done checking for max_fail_percentage 30582 1726855381.49323: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.49323: done checking to see if all hosts have failed 30582 1726855381.49323: getting the remaining hosts for this loop 30582 1726855381.49324: done getting the remaining hosts for this loop 30582 1726855381.49326: getting the next task for host managed_node3 30582 1726855381.49332: done getting next task for host managed_node3 30582 1726855381.49333: ^ task is: TASK: Include network role 30582 1726855381.49335: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.49337: getting variables 30582 1726855381.49338: in VariableManager get_vars() 30582 1726855381.49347: Calling all_inventory to load vars for managed_node3 30582 1726855381.49348: Calling groups_inventory to load vars for managed_node3 30582 1726855381.49350: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.49353: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.49355: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.49356: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.50128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.51004: done with get_vars() 30582 1726855381.51023: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 14:03:01 -0400 (0:00:00.358) 0:01:57.860 ****** 30582 1726855381.51082: entering _queue_task() for managed_node3/include_role 30582 1726855381.51408: worker is 1 (out of 1 available) 30582 1726855381.51422: exiting _queue_task() for managed_node3/include_role 30582 1726855381.51433: done queuing things up, now waiting for results queue to drain 30582 1726855381.51434: waiting for pending results... 
30582 1726855381.51630: running TaskExecutor() for managed_node3/TASK: Include network role 30582 1726855381.51738: in run() - task 0affcc66-ac2b-aa83-7d57-000000002142 30582 1726855381.51750: variable 'ansible_search_path' from source: unknown 30582 1726855381.51753: variable 'ansible_search_path' from source: unknown 30582 1726855381.51786: calling self._execute() 30582 1726855381.51861: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.51873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.51882: variable 'omit' from source: magic vars 30582 1726855381.52174: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.52184: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.52191: _execute() done 30582 1726855381.52195: dumping result to json 30582 1726855381.52199: done dumping result, returning 30582 1726855381.52210: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000002142] 30582 1726855381.52213: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002142 30582 1726855381.52322: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002142 30582 1726855381.52325: WORKER PROCESS EXITING 30582 1726855381.52353: no more pending results, returning what we have 30582 1726855381.52358: in VariableManager get_vars() 30582 1726855381.52410: Calling all_inventory to load vars for managed_node3 30582 1726855381.52413: Calling groups_inventory to load vars for managed_node3 30582 1726855381.52416: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.52429: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.52432: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.52434: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.53259: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.54135: done with get_vars() 30582 1726855381.54150: variable 'ansible_search_path' from source: unknown 30582 1726855381.54151: variable 'ansible_search_path' from source: unknown 30582 1726855381.54248: variable 'omit' from source: magic vars 30582 1726855381.54278: variable 'omit' from source: magic vars 30582 1726855381.54289: variable 'omit' from source: magic vars 30582 1726855381.54292: we have included files to process 30582 1726855381.54293: generating all_blocks data 30582 1726855381.54295: done generating all_blocks data 30582 1726855381.54298: processing included file: fedora.linux_system_roles.network 30582 1726855381.54311: in VariableManager get_vars() 30582 1726855381.54324: done with get_vars() 30582 1726855381.54344: in VariableManager get_vars() 30582 1726855381.54355: done with get_vars() 30582 1726855381.54384: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30582 1726855381.54458: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30582 1726855381.54508: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30582 1726855381.54822: in VariableManager get_vars() 30582 1726855381.54837: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855381.56061: iterating over new_blocks loaded from include file 30582 1726855381.56065: in VariableManager get_vars() 30582 1726855381.56078: done with get_vars() 30582 1726855381.56080: filtering new block on tags 30582 1726855381.56239: done filtering new block on tags 30582 1726855381.56242: in VariableManager get_vars() 30582 1726855381.56253: done with get_vars() 30582 1726855381.56255: filtering new block on tags 30582 1726855381.56267: done 
filtering new block on tags 30582 1726855381.56268: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 30582 1726855381.56272: extending task lists for all hosts with included blocks 30582 1726855381.56338: done extending task lists 30582 1726855381.56339: done processing included files 30582 1726855381.56340: results queue empty 30582 1726855381.56340: checking for any_errors_fatal 30582 1726855381.56341: done checking for any_errors_fatal 30582 1726855381.56342: checking for max_fail_percentage 30582 1726855381.56343: done checking for max_fail_percentage 30582 1726855381.56343: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.56344: done checking to see if all hosts have failed 30582 1726855381.56344: getting the remaining hosts for this loop 30582 1726855381.56345: done getting the remaining hosts for this loop 30582 1726855381.56347: getting the next task for host managed_node3 30582 1726855381.56350: done getting next task for host managed_node3 30582 1726855381.56351: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855381.56354: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855381.56361: getting variables 30582 1726855381.56362: in VariableManager get_vars() 30582 1726855381.56375: Calling all_inventory to load vars for managed_node3 30582 1726855381.56377: Calling groups_inventory to load vars for managed_node3 30582 1726855381.56378: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.56382: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.56384: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.56388: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.57061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.57933: done with get_vars() 30582 1726855381.57952: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:03:01 -0400 (0:00:00.069) 0:01:57.930 ****** 30582 1726855381.58013: entering _queue_task() for managed_node3/include_tasks 30582 1726855381.58296: worker is 1 (out of 1 available) 30582 1726855381.58310: exiting _queue_task() for managed_node3/include_tasks 30582 1726855381.58323: done queuing things up, now waiting for results queue to drain 30582 1726855381.58325: waiting for pending results... 
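Throughout this log, each task is gated by the same conditional, logged as `Evaluated conditional (ansible_distribution_major_version != '6'): True`. The check is a plain string comparison on a gathered fact; when it evaluates False the task is skipped rather than executed. A minimal Python sketch of that gate follows — the function name and dict shape are illustrative, not Ansible's internal API:

```python
def evaluate_distro_gate(facts):
    """Mirror of the logged conditional
    (ansible_distribution_major_version != '6'):
    the task runs only when the major version is not '6' (i.e. not EL6).
    This is an illustrative sketch, not Ansible's actual evaluator."""
    # Jinja compares the fact as a string, so coerce before comparing.
    return str(facts.get("ansible_distribution_major_version")) != "6"


# On the Fedora/EL9 nodes in this run the gate passes, so tasks execute:
evaluate_distro_gate({"ansible_distribution_major_version": "9"})  # -> True
# On an EL6 host the same gate would skip every guarded task:
evaluate_distro_gate({"ansible_distribution_major_version": "6"})  # -> False
```

Because every task in the `fedora.linux_system_roles.network` role carries this `when:`, the log repeats the evaluation once per queued task.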
30582 1726855381.58514: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30582 1726855381.58625: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a4 30582 1726855381.58637: variable 'ansible_search_path' from source: unknown 30582 1726855381.58640: variable 'ansible_search_path' from source: unknown 30582 1726855381.58672: calling self._execute() 30582 1726855381.58746: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.58750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.58757: variable 'omit' from source: magic vars 30582 1726855381.59042: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.59053: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.59057: _execute() done 30582 1726855381.59061: dumping result to json 30582 1726855381.59066: done dumping result, returning 30582 1726855381.59072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-0000000024a4] 30582 1726855381.59077: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a4 30582 1726855381.59172: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a4 30582 1726855381.59175: WORKER PROCESS EXITING 30582 1726855381.59251: no more pending results, returning what we have 30582 1726855381.59256: in VariableManager get_vars() 30582 1726855381.59312: Calling all_inventory to load vars for managed_node3 30582 1726855381.59315: Calling groups_inventory to load vars for managed_node3 30582 1726855381.59318: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.59330: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.59332: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.59335: Calling 
groups_plugins_play to load vars for managed_node3 30582 1726855381.60268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.61154: done with get_vars() 30582 1726855381.61173: variable 'ansible_search_path' from source: unknown 30582 1726855381.61174: variable 'ansible_search_path' from source: unknown 30582 1726855381.61204: we have included files to process 30582 1726855381.61205: generating all_blocks data 30582 1726855381.61206: done generating all_blocks data 30582 1726855381.61209: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855381.61210: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855381.61211: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30582 1726855381.61596: done processing included file 30582 1726855381.61598: iterating over new_blocks loaded from include file 30582 1726855381.61599: in VariableManager get_vars() 30582 1726855381.61617: done with get_vars() 30582 1726855381.61618: filtering new block on tags 30582 1726855381.61638: done filtering new block on tags 30582 1726855381.61640: in VariableManager get_vars() 30582 1726855381.61654: done with get_vars() 30582 1726855381.61655: filtering new block on tags 30582 1726855381.61685: done filtering new block on tags 30582 1726855381.61688: in VariableManager get_vars() 30582 1726855381.61704: done with get_vars() 30582 1726855381.61706: filtering new block on tags 30582 1726855381.61730: done filtering new block on tags 30582 1726855381.61732: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 30582 1726855381.61735: extending task lists for 
all hosts with included blocks 30582 1726855381.62699: done extending task lists 30582 1726855381.62700: done processing included files 30582 1726855381.62700: results queue empty 30582 1726855381.62701: checking for any_errors_fatal 30582 1726855381.62703: done checking for any_errors_fatal 30582 1726855381.62704: checking for max_fail_percentage 30582 1726855381.62704: done checking for max_fail_percentage 30582 1726855381.62705: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.62706: done checking to see if all hosts have failed 30582 1726855381.62706: getting the remaining hosts for this loop 30582 1726855381.62707: done getting the remaining hosts for this loop 30582 1726855381.62708: getting the next task for host managed_node3 30582 1726855381.62712: done getting next task for host managed_node3 30582 1726855381.62714: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855381.62716: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855381.62726: getting variables 30582 1726855381.62726: in VariableManager get_vars() 30582 1726855381.62737: Calling all_inventory to load vars for managed_node3 30582 1726855381.62738: Calling groups_inventory to load vars for managed_node3 30582 1726855381.62740: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.62743: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.62745: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.62747: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.63381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.64324: done with get_vars() 30582 1726855381.64342: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 14:03:01 -0400 (0:00:00.063) 0:01:57.993 ****** 30582 1726855381.64406: entering _queue_task() for managed_node3/setup 30582 1726855381.64693: worker is 1 (out of 1 available) 30582 1726855381.64706: exiting _queue_task() for managed_node3/setup 30582 1726855381.64717: done queuing things up, now waiting for results queue to drain 30582 1726855381.64719: waiting for pending results... 
30582 1726855381.64910: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30582 1726855381.65018: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024fb 30582 1726855381.65029: variable 'ansible_search_path' from source: unknown 30582 1726855381.65033: variable 'ansible_search_path' from source: unknown 30582 1726855381.65067: calling self._execute() 30582 1726855381.65140: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.65144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.65153: variable 'omit' from source: magic vars 30582 1726855381.65443: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.65453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.65606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855381.67115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855381.67162: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855381.67193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855381.67219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855381.67240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855381.67302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855381.67323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855381.67341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855381.67369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855381.67379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855381.67418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855381.67434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855381.67454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855381.67481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855381.67492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855381.67602: variable '__network_required_facts' from source: role 
'' defaults 30582 1726855381.67610: variable 'ansible_facts' from source: unknown 30582 1726855381.68070: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30582 1726855381.68074: when evaluation is False, skipping this task 30582 1726855381.68076: _execute() done 30582 1726855381.68079: dumping result to json 30582 1726855381.68081: done dumping result, returning 30582 1726855381.68095: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-0000000024fb] 30582 1726855381.68099: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fb 30582 1726855381.68191: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fb 30582 1726855381.68194: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855381.68250: no more pending results, returning what we have 30582 1726855381.68254: results queue empty 30582 1726855381.68256: checking for any_errors_fatal 30582 1726855381.68257: done checking for any_errors_fatal 30582 1726855381.68258: checking for max_fail_percentage 30582 1726855381.68260: done checking for max_fail_percentage 30582 1726855381.68261: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.68261: done checking to see if all hosts have failed 30582 1726855381.68262: getting the remaining hosts for this loop 30582 1726855381.68266: done getting the remaining hosts for this loop 30582 1726855381.68269: getting the next task for host managed_node3 30582 1726855381.68281: done getting next task for host managed_node3 30582 1726855381.68285: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855381.68292: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.68322: getting variables 30582 1726855381.68324: in VariableManager get_vars() 30582 1726855381.68369: Calling all_inventory to load vars for managed_node3 30582 1726855381.68373: Calling groups_inventory to load vars for managed_node3 30582 1726855381.68376: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.68386: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.68393: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.68402: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.69219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.70099: done with get_vars() 30582 1726855381.70118: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 14:03:01 -0400 (0:00:00.057) 0:01:58.051 ****** 30582 1726855381.70192: entering _queue_task() for managed_node3/stat 30582 1726855381.70451: worker is 1 (out of 1 available) 30582 1726855381.70468: exiting _queue_task() for managed_node3/stat 30582 1726855381.70480: done queuing things up, now waiting for results queue to drain 30582 1726855381.70482: waiting for pending results... 
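The skip recorded above for "Ensure ansible_facts used by role are present" comes from the guard logged as `(__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False` — fact gathering only runs when at least one required fact is missing from what was already collected. A small Python sketch of the same set-difference logic, with illustrative names (this is not the role's actual code):

```python
def needs_fact_gathering(required_facts, ansible_facts):
    """Emulates the Jinja guard
    __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    True only when some required fact has not been gathered yet.
    Sketch only; variable names are taken from the log, not from role source."""
    missing = [fact for fact in required_facts if fact not in ansible_facts]
    return len(missing) > 0


# All required facts already present -> guard is False, task is skipped
# (this is the case seen in the log above):
needs_fact_gathering(["distribution"], {"distribution": "Fedora", "kernel": "6.1"})  # -> False
# A missing fact would flip the guard and trigger a setup run:
needs_fact_gathering(["distribution", "os_family"], {"distribution": "Fedora"})  # -> True
```

The same pattern explains the next skip as well: `not __network_is_ostree is defined` is False because an earlier `set_fact` already defined `__network_is_ostree`, so the ostree probe is not re-run.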
30582 1726855381.70675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 30582 1726855381.70765: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024fd 30582 1726855381.70776: variable 'ansible_search_path' from source: unknown 30582 1726855381.70779: variable 'ansible_search_path' from source: unknown 30582 1726855381.70811: calling self._execute() 30582 1726855381.70884: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.70890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.70898: variable 'omit' from source: magic vars 30582 1726855381.71184: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.71194: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.71309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855381.71506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855381.71537: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855381.71561: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855381.71590: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855381.71653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855381.71671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855381.71694: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855381.71712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855381.71778: variable '__network_is_ostree' from source: set_fact 30582 1726855381.71785: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855381.71790: when evaluation is False, skipping this task 30582 1726855381.71793: _execute() done 30582 1726855381.71795: dumping result to json 30582 1726855381.71797: done dumping result, returning 30582 1726855381.71805: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-0000000024fd] 30582 1726855381.71810: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fd 30582 1726855381.71898: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fd 30582 1726855381.71901: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855381.71952: no more pending results, returning what we have 30582 1726855381.71955: results queue empty 30582 1726855381.71956: checking for any_errors_fatal 30582 1726855381.71968: done checking for any_errors_fatal 30582 1726855381.71969: checking for max_fail_percentage 30582 1726855381.71971: done checking for max_fail_percentage 30582 1726855381.71972: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.71973: done checking to see if all hosts have failed 30582 1726855381.71973: getting the remaining hosts for this loop 30582 1726855381.71975: done getting the remaining hosts for this loop 30582 
1726855381.71979: getting the next task for host managed_node3 30582 1726855381.71990: done getting next task for host managed_node3 30582 1726855381.71993: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855381.71999: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.72027: getting variables 30582 1726855381.72029: in VariableManager get_vars() 30582 1726855381.72073: Calling all_inventory to load vars for managed_node3 30582 1726855381.72076: Calling groups_inventory to load vars for managed_node3 30582 1726855381.72078: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.72093: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.72097: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.72100: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.73079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.73945: done with get_vars() 30582 1726855381.73962: done getting variables 30582 1726855381.74009: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 14:03:01 -0400 (0:00:00.038) 0:01:58.090 ****** 30582 1726855381.74038: entering _queue_task() for managed_node3/set_fact 30582 1726855381.74304: worker is 1 (out of 1 available) 30582 1726855381.74318: exiting _queue_task() for managed_node3/set_fact 30582 1726855381.74330: done queuing things up, now waiting for results queue to drain 30582 1726855381.74332: waiting for pending results... 
30582 1726855381.74521: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855381.74642: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024fe 30582 1726855381.74655: variable 'ansible_search_path' from source: unknown 30582 1726855381.74659: variable 'ansible_search_path' from source: unknown 30582 1726855381.74692: calling self._execute() 30582 1726855381.74758: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.74762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.74771: variable 'omit' from source: magic vars 30582 1726855381.75046: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.75056: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.75174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855381.75369: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855381.75401: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855381.75432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855381.75455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855381.75520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855381.75540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855381.75557: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855381.75576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855381.75649: variable '__network_is_ostree' from source: set_fact 30582 1726855381.75653: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855381.75655: when evaluation is False, skipping this task 30582 1726855381.75658: _execute() done 30582 1726855381.75660: dumping result to json 30582 1726855381.75662: done dumping result, returning 30582 1726855381.75672: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-0000000024fe] 30582 1726855381.75676: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fe 30582 1726855381.75756: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024fe 30582 1726855381.75759: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855381.75810: no more pending results, returning what we have 30582 1726855381.75813: results queue empty 30582 1726855381.75814: checking for any_errors_fatal 30582 1726855381.75820: done checking for any_errors_fatal 30582 1726855381.75821: checking for max_fail_percentage 30582 1726855381.75822: done checking for max_fail_percentage 30582 1726855381.75823: checking to see if all hosts have failed and the running result is not ok 30582 1726855381.75824: done checking to see if all hosts have failed 30582 1726855381.75825: getting the remaining hosts for this loop 30582 1726855381.75827: done getting the remaining hosts for this loop 
30582 1726855381.75830: getting the next task for host managed_node3 30582 1726855381.75841: done getting next task for host managed_node3 30582 1726855381.75844: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855381.75850: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855381.75884: getting variables 30582 1726855381.75886: in VariableManager get_vars() 30582 1726855381.75933: Calling all_inventory to load vars for managed_node3 30582 1726855381.75935: Calling groups_inventory to load vars for managed_node3 30582 1726855381.75937: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855381.75947: Calling all_plugins_play to load vars for managed_node3 30582 1726855381.75950: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855381.75952: Calling groups_plugins_play to load vars for managed_node3 30582 1726855381.76755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855381.77636: done with get_vars() 30582 1726855381.77653: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:03:01 -0400 (0:00:00.036) 0:01:58.127 ****** 30582 1726855381.77726: entering _queue_task() for managed_node3/service_facts 30582 1726855381.77984: worker is 1 (out of 1 available) 30582 1726855381.78000: exiting _queue_task() for managed_node3/service_facts 30582 1726855381.78013: done queuing things up, now waiting for results queue to drain 30582 1726855381.78015: waiting for pending results... 
30582 1726855381.78199: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855381.78305: in run() - task 0affcc66-ac2b-aa83-7d57-000000002500 30582 1726855381.78316: variable 'ansible_search_path' from source: unknown 30582 1726855381.78320: variable 'ansible_search_path' from source: unknown 30582 1726855381.78352: calling self._execute() 30582 1726855381.78419: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.78422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.78431: variable 'omit' from source: magic vars 30582 1726855381.78725: variable 'ansible_distribution_major_version' from source: facts 30582 1726855381.78735: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855381.78741: variable 'omit' from source: magic vars 30582 1726855381.78794: variable 'omit' from source: magic vars 30582 1726855381.78818: variable 'omit' from source: magic vars 30582 1726855381.78848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855381.78879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855381.78898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855381.78912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.78922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855381.78947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855381.78951: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.78953: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855381.79029: Set connection var ansible_timeout to 10 30582 1726855381.79032: Set connection var ansible_connection to ssh 30582 1726855381.79037: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855381.79042: Set connection var ansible_pipelining to False 30582 1726855381.79047: Set connection var ansible_shell_executable to /bin/sh 30582 1726855381.79050: Set connection var ansible_shell_type to sh 30582 1726855381.79070: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.79073: variable 'ansible_connection' from source: unknown 30582 1726855381.79076: variable 'ansible_module_compression' from source: unknown 30582 1726855381.79078: variable 'ansible_shell_type' from source: unknown 30582 1726855381.79080: variable 'ansible_shell_executable' from source: unknown 30582 1726855381.79082: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855381.79084: variable 'ansible_pipelining' from source: unknown 30582 1726855381.79089: variable 'ansible_timeout' from source: unknown 30582 1726855381.79093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855381.79236: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855381.79245: variable 'omit' from source: magic vars 30582 1726855381.79250: starting attempt loop 30582 1726855381.79253: running the handler 30582 1726855381.79267: _low_level_execute_command(): starting 30582 1726855381.79275: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855381.79764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855381.79796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.79800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855381.79802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.79855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.79858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.79860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.79935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.81632: stdout chunk (state=3): >>>/root <<< 30582 1726855381.81731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.81758: stderr chunk (state=3): >>><<< 30582 1726855381.81761: stdout chunk (state=3): >>><<< 30582 1726855381.81779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.81823: _low_level_execute_command(): starting 30582 1726855381.81827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588 `" && echo ansible-tmp-1726855381.8177855-35909-272240216781588="` echo /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588 `" ) && sleep 0' 30582 1726855381.82232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.82236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855381.82238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855381.82248: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855381.82253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.82293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.82304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.82367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.84308: stdout chunk (state=3): >>>ansible-tmp-1726855381.8177855-35909-272240216781588=/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588 <<< 30582 1726855381.84407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.84432: stderr chunk (state=3): >>><<< 30582 1726855381.84436: stdout chunk (state=3): >>><<< 30582 1726855381.84451: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855381.8177855-35909-272240216781588=/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.84498: variable 'ansible_module_compression' from source: unknown 30582 1726855381.84533: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855381.84564: variable 'ansible_facts' from source: unknown 30582 1726855381.84626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py 30582 1726855381.84730: Sending initial data 30582 1726855381.84733: Sent initial data (162 bytes) 30582 1726855381.85168: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.85171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855381.85174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.85176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855381.85178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855381.85180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.85235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.85241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.85244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.85298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.86864: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855381.86914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855381.86977: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp1or8765j /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py <<< 30582 1726855381.86980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py" <<< 30582 1726855381.87037: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp1or8765j" to remote "/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py" <<< 30582 1726855381.87043: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py" <<< 30582 1726855381.87653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.87697: stderr chunk (state=3): >>><<< 30582 1726855381.87700: stdout chunk (state=3): >>><<< 30582 1726855381.87750: done transferring module to remote 30582 1726855381.87758: _low_level_execute_command(): starting 30582 1726855381.87763: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/ /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py && sleep 0' 30582 1726855381.88214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.88218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855381.88220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.88223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855381.88229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855381.88231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.88268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.88281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.88342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855381.90120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855381.90144: stderr chunk (state=3): >>><<< 30582 1726855381.90147: stdout chunk (state=3): >>><<< 30582 1726855381.90159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855381.90161: _low_level_execute_command(): starting 30582 1726855381.90169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/AnsiballZ_service_facts.py && sleep 0' 30582 1726855381.90604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855381.90609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.90612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855381.90614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855381.90616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855381.90673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855381.90675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855381.90677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855381.90735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.41847: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855383.41882: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855383.41889: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 30582 1726855383.41925: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30582 1726855383.41931: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855383.43438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855383.43471: stderr chunk (state=3): >>><<< 30582 1726855383.43474: stdout chunk (state=3): >>><<< 30582 1726855383.43510: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", 
"source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": 
"indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855383.44250: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855383.44258: _low_level_execute_command(): starting 30582 1726855383.44262: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855381.8177855-35909-272240216781588/ > /dev/null 2>&1 && sleep 0' 30582 1726855383.44716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855383.44719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855383.44723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.44725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855383.44727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.44776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.44780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.44847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.46660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855383.46692: stderr chunk (state=3): >>><<< 30582 1726855383.46695: stdout chunk (state=3): >>><<< 30582 1726855383.46708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30582 1726855383.46713: handler run complete 30582 1726855383.46827: variable 'ansible_facts' from source: unknown 30582 1726855383.46919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855383.47194: variable 'ansible_facts' from source: unknown 30582 1726855383.47272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855383.47383: attempt loop complete, returning result 30582 1726855383.47386: _execute() done 30582 1726855383.47391: dumping result to json 30582 1726855383.47429: done dumping result, returning 30582 1726855383.47437: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-000000002500] 30582 1726855383.47442: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002500 30582 1726855383.48236: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002500 30582 1726855383.48239: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855383.48298: no more pending results, returning what we have 30582 1726855383.48300: results queue empty 30582 1726855383.48300: checking for any_errors_fatal 30582 1726855383.48302: done checking for any_errors_fatal 30582 1726855383.48303: checking for max_fail_percentage 30582 1726855383.48304: done checking for max_fail_percentage 30582 1726855383.48304: checking to see if all hosts have failed and the running result is not ok 30582 1726855383.48305: done checking to see if all hosts have failed 30582 1726855383.48305: getting the remaining hosts for this loop 30582 1726855383.48306: done getting the remaining hosts for this loop 30582 1726855383.48309: getting the next task for host managed_node3 30582 1726855383.48313: done 
getting next task for host managed_node3 30582 1726855383.48315: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855383.48320: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855383.48328: getting variables 30582 1726855383.48329: in VariableManager get_vars() 30582 1726855383.48355: Calling all_inventory to load vars for managed_node3 30582 1726855383.48357: Calling groups_inventory to load vars for managed_node3 30582 1726855383.48358: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855383.48365: Calling all_plugins_play to load vars for managed_node3 30582 1726855383.48367: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855383.48374: Calling groups_plugins_play to load vars for managed_node3 30582 1726855383.49050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855383.49931: done with get_vars() 30582 1726855383.49948: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:03:03 -0400 (0:00:01.722) 0:01:59.850 ****** 30582 1726855383.50022: entering _queue_task() for managed_node3/package_facts 30582 1726855383.50270: worker is 1 (out of 1 available) 30582 1726855383.50285: exiting _queue_task() for managed_node3/package_facts 30582 1726855383.50300: done queuing things up, now waiting for results queue to drain 30582 1726855383.50302: waiting for pending results... 
30582 1726855383.50495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855383.50590: in run() - task 0affcc66-ac2b-aa83-7d57-000000002501 30582 1726855383.50602: variable 'ansible_search_path' from source: unknown 30582 1726855383.50606: variable 'ansible_search_path' from source: unknown 30582 1726855383.50638: calling self._execute() 30582 1726855383.50715: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855383.50719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855383.50728: variable 'omit' from source: magic vars 30582 1726855383.51013: variable 'ansible_distribution_major_version' from source: facts 30582 1726855383.51022: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855383.51028: variable 'omit' from source: magic vars 30582 1726855383.51081: variable 'omit' from source: magic vars 30582 1726855383.51104: variable 'omit' from source: magic vars 30582 1726855383.51135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855383.51161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855383.51181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855383.51196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855383.51207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855383.51241: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855383.51244: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855383.51246: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855383.51325: Set connection var ansible_timeout to 10 30582 1726855383.51328: Set connection var ansible_connection to ssh 30582 1726855383.51333: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855383.51338: Set connection var ansible_pipelining to False 30582 1726855383.51343: Set connection var ansible_shell_executable to /bin/sh 30582 1726855383.51346: Set connection var ansible_shell_type to sh 30582 1726855383.51362: variable 'ansible_shell_executable' from source: unknown 30582 1726855383.51365: variable 'ansible_connection' from source: unknown 30582 1726855383.51371: variable 'ansible_module_compression' from source: unknown 30582 1726855383.51374: variable 'ansible_shell_type' from source: unknown 30582 1726855383.51376: variable 'ansible_shell_executable' from source: unknown 30582 1726855383.51379: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855383.51382: variable 'ansible_pipelining' from source: unknown 30582 1726855383.51385: variable 'ansible_timeout' from source: unknown 30582 1726855383.51392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855383.51533: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855383.51542: variable 'omit' from source: magic vars 30582 1726855383.51546: starting attempt loop 30582 1726855383.51549: running the handler 30582 1726855383.51562: _low_level_execute_command(): starting 30582 1726855383.51572: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855383.52077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855383.52081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855383.52084: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.52137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.52140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855383.52143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.52214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.53877: stdout chunk (state=3): >>>/root <<< 30582 1726855383.53979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855383.54008: stderr chunk (state=3): >>><<< 30582 1726855383.54012: stdout chunk (state=3): >>><<< 30582 1726855383.54032: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855383.54042: _low_level_execute_command(): starting 30582 1726855383.54048: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915 `" && echo ansible-tmp-1726855383.5403006-35923-188211738572915="` echo /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915 `" ) && sleep 0' 30582 1726855383.54467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855383.54470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.54473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855383.54482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.54527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.54531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.54598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.56498: stdout chunk (state=3): >>>ansible-tmp-1726855383.5403006-35923-188211738572915=/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915 <<< 30582 1726855383.56602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855383.56628: stderr chunk (state=3): >>><<< 30582 1726855383.56631: stdout chunk (state=3): >>><<< 30582 1726855383.56643: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855383.5403006-35923-188211738572915=/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855383.56681: variable 'ansible_module_compression' from source: unknown 30582 1726855383.56722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855383.56775: variable 'ansible_facts' from source: unknown 30582 1726855383.56895: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py 30582 1726855383.56994: Sending initial data 30582 1726855383.56997: Sent initial data (162 bytes) 30582 1726855383.57428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855383.57431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855383.57433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.57435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30582 1726855383.57437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.57492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.57501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.57555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.59122: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30582 1726855383.59126: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855383.59181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855383.59241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc5k0dbgs /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py <<< 30582 1726855383.59247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py" <<< 30582 1726855383.59301: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpc5k0dbgs" to remote "/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py" <<< 30582 1726855383.59304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py" <<< 30582 1726855383.60416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855383.60454: stderr chunk (state=3): >>><<< 30582 1726855383.60457: stdout chunk (state=3): >>><<< 30582 1726855383.60492: done transferring module to remote 30582 1726855383.60501: _low_level_execute_command(): starting 30582 1726855383.60505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/ /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py && sleep 0' 30582 1726855383.60941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855383.60945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855383.60947: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.60949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855383.60951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855383.61005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.61008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.61073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855383.62845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855383.62870: stderr chunk (state=3): >>><<< 30582 1726855383.62874: stdout chunk (state=3): >>><<< 30582 1726855383.62893: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855383.62897: _low_level_execute_command(): starting 30582 1726855383.62901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/AnsiballZ_package_facts.py && sleep 0' 30582 1726855383.63336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855383.63339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855383.63342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855383.63344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855383.63346: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855383.63400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855383.63403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855383.63470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855384.07541: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 30582 1726855384.07576: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell":
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name":
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version":
"1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common":
[{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite",
"version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855384.07704: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30582 1726855384.07715: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855384.07742: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", 
"version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855384.07766: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30582 1726855384.07777: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855384.09508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855384.09541: stderr chunk (state=3): >>><<< 30582 1726855384.09543: stdout chunk (state=3): >>><<< 30582 1726855384.09583: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855384.10901: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855384.10918: _low_level_execute_command(): starting 30582 1726855384.10922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855383.5403006-35923-188211738572915/ > /dev/null 2>&1 && sleep 0' 30582 1726855384.11378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855384.11383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855384.11386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.11389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855384.11391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855384.11393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.11442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855384.11446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855384.11452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855384.11515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855384.13377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855384.13405: stderr chunk (state=3): >>><<< 30582 1726855384.13408: stdout chunk (state=3): >>><<< 30582 1726855384.13421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855384.13427: handler run complete 30582 1726855384.13885: variable 'ansible_facts' from source: unknown 30582 1726855384.14214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.15252: variable 'ansible_facts' from source: unknown 30582 1726855384.15496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.15874: attempt loop complete, returning result 30582 1726855384.15883: _execute() done 30582 1726855384.15886: dumping result to json 30582 1726855384.16002: done dumping result, returning 30582 1726855384.16010: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-000000002501] 30582 1726855384.16015: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002501 30582 1726855384.17321: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002501 30582 1726855384.17325: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855384.17425: no more pending results, returning what we have 30582 1726855384.17427: results queue empty 30582 1726855384.17428: checking for any_errors_fatal 30582 1726855384.17434: done checking for any_errors_fatal 30582 1726855384.17434: checking for max_fail_percentage 30582 1726855384.17436: done checking for max_fail_percentage 30582 1726855384.17437: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.17437: done checking to see if all hosts have failed 30582 1726855384.17437: getting the remaining hosts for this loop 30582 1726855384.17438: done getting the remaining 
hosts for this loop 30582 1726855384.17441: getting the next task for host managed_node3 30582 1726855384.17446: done getting next task for host managed_node3 30582 1726855384.17448: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855384.17452: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855384.17461: getting variables 30582 1726855384.17462: in VariableManager get_vars() 30582 1726855384.17491: Calling all_inventory to load vars for managed_node3 30582 1726855384.17493: Calling groups_inventory to load vars for managed_node3 30582 1726855384.17495: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.17501: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.17503: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.17504: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.18193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.19075: done with get_vars() 30582 1726855384.19093: done getting variables 30582 1726855384.19136: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:03:04 -0400 (0:00:00.691) 0:02:00.541 ****** 30582 1726855384.19160: entering _queue_task() for managed_node3/debug 30582 1726855384.19419: worker is 1 (out of 1 available) 30582 1726855384.19434: exiting _queue_task() for managed_node3/debug 30582 1726855384.19447: done queuing things up, now waiting for results queue to drain 30582 1726855384.19448: waiting for pending results... 
30582 1726855384.19646: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855384.19734: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a5 30582 1726855384.19746: variable 'ansible_search_path' from source: unknown 30582 1726855384.19749: variable 'ansible_search_path' from source: unknown 30582 1726855384.19782: calling self._execute() 30582 1726855384.19856: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.19860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.19868: variable 'omit' from source: magic vars 30582 1726855384.20151: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.20170: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.20175: variable 'omit' from source: magic vars 30582 1726855384.20214: variable 'omit' from source: magic vars 30582 1726855384.20284: variable 'network_provider' from source: set_fact 30582 1726855384.20300: variable 'omit' from source: magic vars 30582 1726855384.20332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855384.20359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855384.20377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855384.20392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855384.20403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855384.20426: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855384.20429: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855384.20433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.20506: Set connection var ansible_timeout to 10 30582 1726855384.20509: Set connection var ansible_connection to ssh 30582 1726855384.20515: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855384.20519: Set connection var ansible_pipelining to False 30582 1726855384.20525: Set connection var ansible_shell_executable to /bin/sh 30582 1726855384.20527: Set connection var ansible_shell_type to sh 30582 1726855384.20544: variable 'ansible_shell_executable' from source: unknown 30582 1726855384.20547: variable 'ansible_connection' from source: unknown 30582 1726855384.20550: variable 'ansible_module_compression' from source: unknown 30582 1726855384.20552: variable 'ansible_shell_type' from source: unknown 30582 1726855384.20554: variable 'ansible_shell_executable' from source: unknown 30582 1726855384.20556: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.20558: variable 'ansible_pipelining' from source: unknown 30582 1726855384.20560: variable 'ansible_timeout' from source: unknown 30582 1726855384.20568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.20668: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855384.20677: variable 'omit' from source: magic vars 30582 1726855384.20681: starting attempt loop 30582 1726855384.20683: running the handler 30582 1726855384.20720: handler run complete 30582 1726855384.20731: attempt loop complete, returning result 30582 1726855384.20733: _execute() done 30582 1726855384.20736: dumping result to json 30582 1726855384.20739: done dumping result, returning 
30582 1726855384.20746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-0000000024a5] 30582 1726855384.20751: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a5 30582 1726855384.20837: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a5 30582 1726855384.20840: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855384.20916: no more pending results, returning what we have 30582 1726855384.20920: results queue empty 30582 1726855384.20921: checking for any_errors_fatal 30582 1726855384.20931: done checking for any_errors_fatal 30582 1726855384.20931: checking for max_fail_percentage 30582 1726855384.20933: done checking for max_fail_percentage 30582 1726855384.20934: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.20934: done checking to see if all hosts have failed 30582 1726855384.20935: getting the remaining hosts for this loop 30582 1726855384.20937: done getting the remaining hosts for this loop 30582 1726855384.20940: getting the next task for host managed_node3 30582 1726855384.20948: done getting next task for host managed_node3 30582 1726855384.20956: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855384.20960: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.20975: getting variables 30582 1726855384.20976: in VariableManager get_vars() 30582 1726855384.21016: Calling all_inventory to load vars for managed_node3 30582 1726855384.21019: Calling groups_inventory to load vars for managed_node3 30582 1726855384.21021: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.21029: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.21032: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.21034: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.21913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.22788: done with get_vars() 30582 1726855384.22809: done getting variables 30582 1726855384.22852: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:03:04 -0400 (0:00:00.037) 0:02:00.578 ****** 30582 1726855384.22886: entering _queue_task() for managed_node3/fail 30582 1726855384.23147: worker is 1 (out of 1 available) 30582 1726855384.23165: exiting _queue_task() for managed_node3/fail 30582 1726855384.23178: done queuing things up, now waiting for results queue to drain 30582 1726855384.23180: waiting for pending results... 30582 1726855384.23374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855384.23454: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a6 30582 1726855384.23469: variable 'ansible_search_path' from source: unknown 30582 1726855384.23472: variable 'ansible_search_path' from source: unknown 30582 1726855384.23504: calling self._execute() 30582 1726855384.23583: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.23589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.23597: variable 'omit' from source: magic vars 30582 1726855384.23894: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.23905: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.23992: variable 'network_state' from source: role '' defaults 30582 1726855384.24002: Evaluated conditional (network_state != {}): False 30582 1726855384.24006: when evaluation is False, skipping this task 30582 1726855384.24008: _execute() done 30582 1726855384.24011: dumping result to json 30582 1726855384.24013: done dumping result, returning 30582 1726855384.24020: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-0000000024a6] 30582 1726855384.24025: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a6 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855384.24164: no more pending results, returning what we have 30582 1726855384.24168: results queue empty 30582 1726855384.24169: checking for any_errors_fatal 30582 1726855384.24176: done checking for any_errors_fatal 30582 1726855384.24177: checking for max_fail_percentage 30582 1726855384.24178: done checking for max_fail_percentage 30582 1726855384.24179: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.24180: done checking to see if all hosts have failed 30582 1726855384.24181: getting the remaining hosts for this loop 30582 1726855384.24182: done getting the remaining hosts for this loop 30582 1726855384.24185: getting the next task for host managed_node3 30582 1726855384.24198: done getting next task for host managed_node3 30582 1726855384.24201: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855384.24207: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.24241: getting variables 30582 1726855384.24243: in VariableManager get_vars() 30582 1726855384.24292: Calling all_inventory to load vars for managed_node3 30582 1726855384.24295: Calling groups_inventory to load vars for managed_node3 30582 1726855384.24298: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.24303: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a6 30582 1726855384.24305: WORKER PROCESS EXITING 30582 1726855384.24315: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.24318: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.24320: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.25119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.25991: done with get_vars() 30582 1726855384.26011: done getting variables 30582 1726855384.26055: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:03:04 -0400 (0:00:00.031) 0:02:00.610 ****** 30582 1726855384.26081: entering _queue_task() for managed_node3/fail 30582 1726855384.26340: worker is 1 (out of 1 available) 30582 1726855384.26354: exiting _queue_task() for managed_node3/fail 30582 1726855384.26367: done queuing things up, now waiting for results queue to drain 30582 1726855384.26369: waiting for pending results... 30582 1726855384.26563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855384.26644: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a7 30582 1726855384.26656: variable 'ansible_search_path' from source: unknown 30582 1726855384.26661: variable 'ansible_search_path' from source: unknown 30582 1726855384.26692: calling self._execute() 30582 1726855384.26774: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.26778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.26785: variable 'omit' from source: magic vars 30582 1726855384.27079: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.27089: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.27175: variable 'network_state' from source: role '' defaults 30582 1726855384.27186: Evaluated conditional (network_state != {}): False 30582 1726855384.27190: when evaluation is False, skipping this task 30582 1726855384.27193: _execute() done 30582 1726855384.27195: dumping result to json 30582 1726855384.27197: done dumping result, returning 30582 1726855384.27205: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-0000000024a7] 30582 1726855384.27210: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855384.27348: no more pending results, returning what we have 30582 1726855384.27352: results queue empty 30582 1726855384.27353: checking for any_errors_fatal 30582 1726855384.27360: done checking for any_errors_fatal 30582 1726855384.27360: checking for max_fail_percentage 30582 1726855384.27363: done checking for max_fail_percentage 30582 1726855384.27364: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.27364: done checking to see if all hosts have failed 30582 1726855384.27365: getting the remaining hosts for this loop 30582 1726855384.27366: done getting the remaining hosts for this loop 30582 1726855384.27370: getting the next task for host managed_node3 30582 1726855384.27378: done getting next task for host managed_node3 30582 1726855384.27382: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855384.27389: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.27424: getting variables 30582 1726855384.27426: in VariableManager get_vars() 30582 1726855384.27471: Calling all_inventory to load vars for managed_node3 30582 1726855384.27474: Calling groups_inventory to load vars for managed_node3 30582 1726855384.27476: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.27486: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.27494: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.27500: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a7 30582 1726855384.27502: WORKER PROCESS EXITING 30582 1726855384.27505: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.28419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.29282: done with get_vars() 30582 1726855384.29302: done getting variables 30582 1726855384.29348: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:03:04 -0400 (0:00:00.032) 0:02:00.643 ****** 30582 1726855384.29375: entering _queue_task() for managed_node3/fail 30582 1726855384.29639: worker is 1 (out of 1 available) 30582 1726855384.29653: exiting _queue_task() for managed_node3/fail 30582 1726855384.29665: done queuing things up, now waiting for results queue to drain 30582 1726855384.29667: waiting for pending results... 30582 1726855384.29863: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855384.29968: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a8 30582 1726855384.29980: variable 'ansible_search_path' from source: unknown 30582 1726855384.29984: variable 'ansible_search_path' from source: unknown 30582 1726855384.30016: calling self._execute() 30582 1726855384.30090: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.30094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.30103: variable 'omit' from source: magic vars 30582 1726855384.30378: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.30389: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.30510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.32036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.32096: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.32124: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 
1726855384.32149: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.32174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.32233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.32254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.32274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.32305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.32316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.32382: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.32398: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855384.32474: variable 'ansible_distribution' from source: facts 30582 1726855384.32477: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.32485: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855384.32644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.32662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.32681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.32708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.32719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.32753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.32771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.32789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.32813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.32825: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.32853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.32872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.32890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.32915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.32926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.33114: variable 'network_connections' from source: include params 30582 1726855384.33123: variable 'interface' from source: play vars 30582 1726855384.33172: variable 'interface' from source: play vars 30582 1726855384.33180: variable 'network_state' from source: role '' defaults 30582 1726855384.33226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.33334: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.33360: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 
1726855384.33386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.33408: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.33447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.33464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.33491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.33509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.33528: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855384.33531: when evaluation is False, skipping this task 30582 1726855384.33534: _execute() done 30582 1726855384.33536: dumping result to json 30582 1726855384.33538: done dumping result, returning 30582 1726855384.33546: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-0000000024a8] 30582 1726855384.33550: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a8 30582 1726855384.33640: done sending task 
result for task 0affcc66-ac2b-aa83-7d57-0000000024a8 30582 1726855384.33643: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855384.33689: no more pending results, returning what we have 30582 1726855384.33692: results queue empty 30582 1726855384.33693: checking for any_errors_fatal 30582 1726855384.33701: done checking for any_errors_fatal 30582 1726855384.33702: checking for max_fail_percentage 30582 1726855384.33704: done checking for max_fail_percentage 30582 1726855384.33705: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.33706: done checking to see if all hosts have failed 30582 1726855384.33706: getting the remaining hosts for this loop 30582 1726855384.33708: done getting the remaining hosts for this loop 30582 1726855384.33711: getting the next task for host managed_node3 30582 1726855384.33720: done getting next task for host managed_node3 30582 1726855384.33724: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855384.33728: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.33759: getting variables 30582 1726855384.33760: in VariableManager get_vars() 30582 1726855384.33810: Calling all_inventory to load vars for managed_node3 30582 1726855384.33813: Calling groups_inventory to load vars for managed_node3 30582 1726855384.33815: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.33824: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.33827: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.33829: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.34649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.35649: done with get_vars() 30582 1726855384.35668: done getting variables 30582 1726855384.35713: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:03:04 -0400 (0:00:00.063) 0:02:00.707 ****** 30582 1726855384.35739: entering _queue_task() for managed_node3/dnf 30582 1726855384.36001: worker is 1 (out of 1 available) 30582 1726855384.36017: exiting _queue_task() for managed_node3/dnf 30582 1726855384.36030: done queuing things up, now waiting for results queue to drain 30582 1726855384.36031: waiting for pending results... 30582 1726855384.36233: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855384.36332: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024a9 30582 1726855384.36342: variable 'ansible_search_path' from source: unknown 30582 1726855384.36345: variable 'ansible_search_path' from source: unknown 30582 1726855384.36377: calling self._execute() 30582 1726855384.36453: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.36457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.36468: variable 'omit' from source: magic vars 30582 1726855384.36747: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.36756: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.36896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.43385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.43428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.43456: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.43480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.43511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.43562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.43583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.43603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.43629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.43639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.43719: variable 'ansible_distribution' from source: facts 30582 1726855384.43722: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.43734: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855384.43815: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.43900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.43916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.43932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.43956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.43968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.43999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.44015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.44031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.44055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.44067: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.44093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.44111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.44127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.44150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.44160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.44258: variable 'network_connections' from source: include params 30582 1726855384.44268: variable 'interface' from source: play vars 30582 1726855384.44311: variable 'interface' from source: play vars 30582 1726855384.44356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.44461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.44490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855384.44512: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.44535: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.44565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.44579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.44601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.44618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.44646: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855384.44800: variable 'network_connections' from source: include params 30582 1726855384.44804: variable 'interface' from source: play vars 30582 1726855384.44847: variable 'interface' from source: play vars 30582 1726855384.44868: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855384.44872: when evaluation is False, skipping this task 30582 1726855384.44874: _execute() done 30582 1726855384.44876: dumping result to json 30582 1726855384.44878: done dumping result, returning 30582 1726855384.44883: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000024a9] 30582 
1726855384.44885: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a9 30582 1726855384.44975: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024a9 30582 1726855384.44978: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855384.45020: no more pending results, returning what we have 30582 1726855384.45023: results queue empty 30582 1726855384.45024: checking for any_errors_fatal 30582 1726855384.45030: done checking for any_errors_fatal 30582 1726855384.45031: checking for max_fail_percentage 30582 1726855384.45033: done checking for max_fail_percentage 30582 1726855384.45034: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.45035: done checking to see if all hosts have failed 30582 1726855384.45035: getting the remaining hosts for this loop 30582 1726855384.45037: done getting the remaining hosts for this loop 30582 1726855384.45040: getting the next task for host managed_node3 30582 1726855384.45048: done getting next task for host managed_node3 30582 1726855384.45052: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855384.45057: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.45083: getting variables 30582 1726855384.45085: in VariableManager get_vars() 30582 1726855384.45128: Calling all_inventory to load vars for managed_node3 30582 1726855384.45130: Calling groups_inventory to load vars for managed_node3 30582 1726855384.45132: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.45141: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.45143: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.45145: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.50558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.51411: done with get_vars() 30582 1726855384.51428: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855384.51477: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:03:04 -0400 (0:00:00.157) 0:02:00.864 ****** 30582 1726855384.51501: entering _queue_task() for managed_node3/yum 30582 1726855384.51778: worker is 1 (out of 1 available) 30582 1726855384.51794: exiting _queue_task() for managed_node3/yum 30582 1726855384.51806: done queuing things up, now waiting for results queue to drain 30582 1726855384.51808: waiting for pending results... 30582 1726855384.51999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855384.52103: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024aa 30582 1726855384.52115: variable 'ansible_search_path' from source: unknown 30582 1726855384.52119: variable 'ansible_search_path' from source: unknown 30582 1726855384.52150: calling self._execute() 30582 1726855384.52229: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.52234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.52242: variable 'omit' from source: magic vars 30582 1726855384.52529: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.52538: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.52665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.54206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.54258: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.54286: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.54315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.54336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.54395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.54416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.54435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.54460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.54472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.54551: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.54568: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855384.54571: when evaluation is False, skipping this task 30582 1726855384.54574: _execute() done 30582 1726855384.54576: dumping result to json 30582 1726855384.54578: done dumping result, returning 30582 1726855384.54586: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000024aa] 30582 1726855384.54593: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024aa 30582 1726855384.54690: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024aa 30582 1726855384.54693: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855384.54740: no more pending results, returning what we have 30582 1726855384.54744: results queue empty 30582 1726855384.54745: checking for any_errors_fatal 30582 1726855384.54752: done checking for any_errors_fatal 30582 1726855384.54753: checking for max_fail_percentage 30582 1726855384.54755: done checking for max_fail_percentage 30582 1726855384.54756: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.54756: done checking to see if all hosts have failed 30582 1726855384.54757: getting the remaining hosts for this loop 30582 1726855384.54759: done getting the remaining hosts for this loop 30582 1726855384.54762: getting the next task for host managed_node3 30582 1726855384.54774: done getting next task for host managed_node3 30582 1726855384.54778: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855384.54783: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.54815: getting variables 30582 1726855384.54817: in VariableManager get_vars() 30582 1726855384.54860: Calling all_inventory to load vars for managed_node3 30582 1726855384.54865: Calling groups_inventory to load vars for managed_node3 30582 1726855384.54867: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.54877: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.54880: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.54882: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.55695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.56688: done with get_vars() 30582 1726855384.56705: done getting variables 30582 1726855384.56748: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:03:04 -0400 (0:00:00.052) 0:02:00.917 ****** 30582 1726855384.56778: entering _queue_task() for managed_node3/fail 30582 1726855384.57025: worker is 1 (out of 1 available) 30582 1726855384.57040: exiting _queue_task() for managed_node3/fail 30582 1726855384.57052: done queuing things up, now waiting for results queue to drain 30582 1726855384.57054: waiting for pending results... 30582 1726855384.57237: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855384.57342: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024ab 30582 1726855384.57353: variable 'ansible_search_path' from source: unknown 30582 1726855384.57356: variable 'ansible_search_path' from source: unknown 30582 1726855384.57389: calling self._execute() 30582 1726855384.57460: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.57466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.57474: variable 'omit' from source: magic vars 30582 1726855384.57753: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.57762: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.57855: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.57985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.59475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.59529: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.59555: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.59582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.59604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.59659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.59683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.59701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.59728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.59738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.59771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.59789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.59811: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.59836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.59846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.59877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.59905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.59916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.59940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.59950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.60066: variable 'network_connections' from source: include params 30582 1726855384.60078: variable 'interface' from source: play vars 30582 1726855384.60125: variable 'interface' from source: play vars 30582 1726855384.60176: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.60288: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.60323: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855384.60348: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.60371: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.60403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.60418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.60435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.60455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.60496: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855384.60652: variable 'network_connections' from source: include params 30582 1726855384.60655: variable 'interface' from source: play vars 30582 1726855384.60704: variable 'interface' from source: play vars 30582 1726855384.60722: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855384.60726: when evaluation is False, skipping this task 30582 
1726855384.60728: _execute() done 30582 1726855384.60731: dumping result to json 30582 1726855384.60733: done dumping result, returning 30582 1726855384.60740: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000024ab] 30582 1726855384.60745: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ab 30582 1726855384.60848: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ab 30582 1726855384.60851: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855384.60929: no more pending results, returning what we have 30582 1726855384.60933: results queue empty 30582 1726855384.60934: checking for any_errors_fatal 30582 1726855384.60941: done checking for any_errors_fatal 30582 1726855384.60942: checking for max_fail_percentage 30582 1726855384.60943: done checking for max_fail_percentage 30582 1726855384.60944: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.60945: done checking to see if all hosts have failed 30582 1726855384.60946: getting the remaining hosts for this loop 30582 1726855384.60947: done getting the remaining hosts for this loop 30582 1726855384.60951: getting the next task for host managed_node3 30582 1726855384.60958: done getting next task for host managed_node3 30582 1726855384.60962: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855384.60967: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.60996: getting variables 30582 1726855384.60998: in VariableManager get_vars() 30582 1726855384.61039: Calling all_inventory to load vars for managed_node3 30582 1726855384.61042: Calling groups_inventory to load vars for managed_node3 30582 1726855384.61044: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.61055: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.61057: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.61060: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.61904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.62784: done with get_vars() 30582 1726855384.62804: done getting variables 30582 1726855384.62848: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:03:04 -0400 (0:00:00.060) 0:02:00.978 ****** 30582 1726855384.62875: entering _queue_task() for managed_node3/package 30582 1726855384.63128: worker is 1 (out of 1 available) 30582 1726855384.63143: exiting _queue_task() for managed_node3/package 30582 1726855384.63154: done queuing things up, now waiting for results queue to drain 30582 1726855384.63156: waiting for pending results... 30582 1726855384.63350: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855384.63452: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024ac 30582 1726855384.63463: variable 'ansible_search_path' from source: unknown 30582 1726855384.63469: variable 'ansible_search_path' from source: unknown 30582 1726855384.63502: calling self._execute() 30582 1726855384.63580: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.63584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.63595: variable 'omit' from source: magic vars 30582 1726855384.63896: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.63906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.64041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.64248: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.64288: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855384.64315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.64376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.64470: variable 'network_packages' from source: role '' defaults 30582 1726855384.64543: variable '__network_provider_setup' from source: role '' defaults 30582 1726855384.64552: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855384.64599: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855384.64607: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855384.64651: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855384.64765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.66397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.66441: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.66469: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.66492: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.66512: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.66572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.66593: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.66611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.66636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.66653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.66681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.66699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.66716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.66739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.66749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855384.66891: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855384.66975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.66994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.67010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.67034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.67044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.67110: variable 'ansible_python' from source: facts 30582 1726855384.67123: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855384.67179: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855384.67236: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855384.67320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.67337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.67353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.67379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.67390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.67423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.67443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.67460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.67485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.67497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.67591: variable 'network_connections' from source: include params 
30582 1726855384.67596: variable 'interface' from source: play vars 30582 1726855384.67669: variable 'interface' from source: play vars 30582 1726855384.67717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.67736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.67759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.67782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.67820: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.67998: variable 'network_connections' from source: include params 30582 1726855384.68001: variable 'interface' from source: play vars 30582 1726855384.68068: variable 'interface' from source: play vars 30582 1726855384.68099: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855384.68150: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.68344: variable 'network_connections' from source: include params 30582 1726855384.68348: variable 'interface' from source: play vars 30582 1726855384.68397: variable 'interface' from source: play vars 30582 1726855384.68413: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855384.68467: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855384.68658: variable 'network_connections' 
from source: include params 30582 1726855384.68661: variable 'interface' from source: play vars 30582 1726855384.68707: variable 'interface' from source: play vars 30582 1726855384.68745: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855384.68786: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855384.68794: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855384.68841: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855384.68967: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855384.69255: variable 'network_connections' from source: include params 30582 1726855384.69258: variable 'interface' from source: play vars 30582 1726855384.69305: variable 'interface' from source: play vars 30582 1726855384.69311: variable 'ansible_distribution' from source: facts 30582 1726855384.69314: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.69321: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.69332: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855384.69436: variable 'ansible_distribution' from source: facts 30582 1726855384.69439: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.69444: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.69456: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855384.69561: variable 'ansible_distribution' from source: facts 30582 1726855384.69567: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.69570: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.69597: variable 'network_provider' from source: set_fact 30582 
1726855384.69608: variable 'ansible_facts' from source: unknown 30582 1726855384.69969: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855384.69972: when evaluation is False, skipping this task 30582 1726855384.69975: _execute() done 30582 1726855384.69977: dumping result to json 30582 1726855384.69979: done dumping result, returning 30582 1726855384.69986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-0000000024ac] 30582 1726855384.69992: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ac 30582 1726855384.70092: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ac skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855384.70140: no more pending results, returning what we have 30582 1726855384.70144: results queue empty 30582 1726855384.70145: checking for any_errors_fatal 30582 1726855384.70150: done checking for any_errors_fatal 30582 1726855384.70151: checking for max_fail_percentage 30582 1726855384.70153: done checking for max_fail_percentage 30582 1726855384.70154: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.70154: done checking to see if all hosts have failed 30582 1726855384.70155: getting the remaining hosts for this loop 30582 1726855384.70156: done getting the remaining hosts for this loop 30582 1726855384.70160: getting the next task for host managed_node3 30582 1726855384.70171: done getting next task for host managed_node3 30582 1726855384.70175: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855384.70180: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855384.70213: getting variables 30582 1726855384.70214: in VariableManager get_vars() 30582 1726855384.70260: Calling all_inventory to load vars for managed_node3 30582 1726855384.70270: Calling groups_inventory to load vars for managed_node3 30582 1726855384.70272: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.70282: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.70284: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.70286: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.70301: WORKER PROCESS EXITING 30582 1726855384.71306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.72171: done with get_vars() 30582 1726855384.72189: done getting variables 30582 1726855384.72231: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:03:04 -0400 (0:00:00.093) 0:02:01.072 ****** 30582 1726855384.72259: entering _queue_task() for managed_node3/package 30582 1726855384.72517: worker is 1 (out of 1 available) 30582 1726855384.72533: exiting _queue_task() for managed_node3/package 30582 1726855384.72545: done queuing things up, now waiting for results queue to drain 30582 1726855384.72547: waiting for pending results... 
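The "Install packages" task above was skipped because its condition, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False, i.e. every package the role wants was already present on the managed node. Jinja2's `subset` test corresponds to Python set containment; the sketch below illustrates the check with made-up values (the real `network_packages` list comes from the role defaults and the installed-package facts from `package_facts` on managed_node3):

```python
# Illustrative sketch of the skip condition from the "Install packages" task.
# Values are placeholders, not the actual data from managed_node3.
network_packages = ["NetworkManager"]               # role default for the nm provider
installed = {"NetworkManager": [], "kernel": []}    # shape of ansible_facts.packages

# Jinja2's `subset` test maps onto Python's set containment:
condition = not set(network_packages).issubset(installed.keys())
print(condition)  # False -> "Conditional result was False" -> task skipped
```

Because the condition is False, the task reports `skipped` with `changed: false`, exactly as logged.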
30582 1726855384.72739: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855384.72840: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024ad 30582 1726855384.72852: variable 'ansible_search_path' from source: unknown 30582 1726855384.72855: variable 'ansible_search_path' from source: unknown 30582 1726855384.72885: calling self._execute() 30582 1726855384.72960: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.72965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.72976: variable 'omit' from source: magic vars 30582 1726855384.73263: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.73275: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.73364: variable 'network_state' from source: role '' defaults 30582 1726855384.73375: Evaluated conditional (network_state != {}): False 30582 1726855384.73378: when evaluation is False, skipping this task 30582 1726855384.73381: _execute() done 30582 1726855384.73384: dumping result to json 30582 1726855384.73386: done dumping result, returning 30582 1726855384.73396: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000024ad] 30582 1726855384.73400: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ad 30582 1726855384.73493: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ad 30582 1726855384.73496: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855384.73565: no more pending results, returning what we have 30582 1726855384.73568: results queue empty 30582 1726855384.73569: checking 
for any_errors_fatal 30582 1726855384.73578: done checking for any_errors_fatal 30582 1726855384.73578: checking for max_fail_percentage 30582 1726855384.73580: done checking for max_fail_percentage 30582 1726855384.73581: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.73581: done checking to see if all hosts have failed 30582 1726855384.73582: getting the remaining hosts for this loop 30582 1726855384.73584: done getting the remaining hosts for this loop 30582 1726855384.73589: getting the next task for host managed_node3 30582 1726855384.73597: done getting next task for host managed_node3 30582 1726855384.73601: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855384.73607: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855384.73631: getting variables 30582 1726855384.73633: in VariableManager get_vars() 30582 1726855384.73668: Calling all_inventory to load vars for managed_node3 30582 1726855384.73670: Calling groups_inventory to load vars for managed_node3 30582 1726855384.73672: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.73680: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.73683: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.73685: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.74441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.75330: done with get_vars() 30582 1726855384.75348: done getting variables 30582 1726855384.75393: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:03:04 -0400 (0:00:00.031) 0:02:01.104 ****** 30582 1726855384.75417: entering _queue_task() for managed_node3/package 30582 1726855384.75657: worker is 1 (out of 1 available) 30582 1726855384.75673: exiting _queue_task() for managed_node3/package 30582 1726855384.75686: done queuing things up, now waiting for results queue to drain 30582 1726855384.75689: waiting for pending results... 
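Both nmstate-related install tasks in this stretch ("Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable") are gated on the same condition, `network_state != {}`. The log shows `network_state` resolving from the role defaults, which leave it as an empty dict, so both tasks skip. A minimal sketch of that gate:

```python
# Sketch of the gate on the nmstate install tasks: they run only when the
# caller supplies a non-empty network_state. The role default is an empty dict,
# as resolved from "role '' defaults" in the log above.
network_state = {}

run_task = network_state != {}
print(run_task)  # False -> task skipped with skip_reason "Conditional result was False"
```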
30582 1726855384.75875: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855384.75979: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024ae 30582 1726855384.75991: variable 'ansible_search_path' from source: unknown 30582 1726855384.75995: variable 'ansible_search_path' from source: unknown 30582 1726855384.76025: calling self._execute() 30582 1726855384.76098: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.76104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.76113: variable 'omit' from source: magic vars 30582 1726855384.76395: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.76404: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.76494: variable 'network_state' from source: role '' defaults 30582 1726855384.76503: Evaluated conditional (network_state != {}): False 30582 1726855384.76506: when evaluation is False, skipping this task 30582 1726855384.76509: _execute() done 30582 1726855384.76512: dumping result to json 30582 1726855384.76514: done dumping result, returning 30582 1726855384.76522: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-0000000024ae] 30582 1726855384.76526: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ae 30582 1726855384.76617: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024ae 30582 1726855384.76619: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855384.76666: no more pending results, returning what we have 30582 1726855384.76670: results queue empty 30582 1726855384.76671: checking for 
any_errors_fatal 30582 1726855384.76679: done checking for any_errors_fatal 30582 1726855384.76679: checking for max_fail_percentage 30582 1726855384.76681: done checking for max_fail_percentage 30582 1726855384.76682: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.76683: done checking to see if all hosts have failed 30582 1726855384.76684: getting the remaining hosts for this loop 30582 1726855384.76685: done getting the remaining hosts for this loop 30582 1726855384.76691: getting the next task for host managed_node3 30582 1726855384.76699: done getting next task for host managed_node3 30582 1726855384.76703: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855384.76708: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855384.76737: getting variables 30582 1726855384.76739: in VariableManager get_vars() 30582 1726855384.76778: Calling all_inventory to load vars for managed_node3 30582 1726855384.76781: Calling groups_inventory to load vars for managed_node3 30582 1726855384.76783: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.76797: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.76800: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.76803: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.77711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.78580: done with get_vars() 30582 1726855384.78598: done getting variables 30582 1726855384.78642: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:03:04 -0400 (0:00:00.032) 0:02:01.136 ****** 30582 1726855384.78672: entering _queue_task() for managed_node3/service 30582 1726855384.78928: worker is 1 (out of 1 available) 30582 1726855384.78941: exiting _queue_task() for managed_node3/service 30582 1726855384.78954: done queuing things up, now waiting for results queue to drain 30582 1726855384.78955: waiting for pending results... 
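The "Restart NetworkManager due to wireless or team interfaces" task queued here is guarded the same way as the "Ask user's consent" task at the top of this stretch, whose `false_condition` was logged as `__network_wireless_connections_defined or __network_team_connections_defined`. The sketch below is an illustrative approximation of how such flags could be derived from `network_connections` (it is not the role's actual Jinja2 expression; the connection entry shown is a placeholder):

```python
# Illustrative approximation (not the role's exact logic): scan the requested
# network_connections for wireless or team profiles and gate the
# NetworkManager restart on the result.
network_connections = [{"name": "example-conn", "type": "bridge"}]  # placeholder play vars

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> restart task is skipped, matching the consent task above
```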
30582 1726855384.79147: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855384.79246: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024af 30582 1726855384.79257: variable 'ansible_search_path' from source: unknown 30582 1726855384.79260: variable 'ansible_search_path' from source: unknown 30582 1726855384.79295: calling self._execute() 30582 1726855384.79364: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.79372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.79381: variable 'omit' from source: magic vars 30582 1726855384.79680: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.79690: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.79784: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.79919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.81458: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.81514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.81544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.81577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.81595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.81651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855384.81674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.81696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.81722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.81734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.81766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.81784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.81806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.81830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.81841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.81872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.81889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.81909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.81933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.81943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.82058: variable 'network_connections' from source: include params 30582 1726855384.82071: variable 'interface' from source: play vars 30582 1726855384.82123: variable 'interface' from source: play vars 30582 1726855384.82171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.82283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.82320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855384.82343: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.82364: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.82397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.82412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.82429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.82447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.82490: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855384.82643: variable 'network_connections' from source: include params 30582 1726855384.82646: variable 'interface' from source: play vars 30582 1726855384.82694: variable 'interface' from source: play vars 30582 1726855384.82714: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855384.82717: when evaluation is False, skipping this task 30582 1726855384.82720: _execute() done 30582 1726855384.82722: dumping result to json 30582 1726855384.82725: done dumping result, returning 30582 1726855384.82732: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-0000000024af] 30582 1726855384.82737: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024af skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855384.82891: no more pending results, returning what we have 30582 1726855384.82895: results queue empty 30582 1726855384.82896: checking for any_errors_fatal 30582 1726855384.82902: done checking for any_errors_fatal 30582 1726855384.82903: checking for max_fail_percentage 30582 1726855384.82906: done checking for max_fail_percentage 30582 1726855384.82907: checking to see if all hosts have failed and the running result is not ok 30582 1726855384.82908: done checking to see if all hosts have failed 30582 1726855384.82908: getting the remaining hosts for this loop 30582 1726855384.82910: done getting the remaining hosts for this loop 30582 1726855384.82913: getting the next task for host managed_node3 30582 1726855384.82922: done getting next task for host managed_node3 30582 1726855384.82926: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855384.82930: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855384.82962: getting variables 30582 1726855384.82966: in VariableManager get_vars() 30582 1726855384.83011: Calling all_inventory to load vars for managed_node3 30582 1726855384.83014: Calling groups_inventory to load vars for managed_node3 30582 1726855384.83016: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855384.83026: Calling all_plugins_play to load vars for managed_node3 30582 1726855384.83028: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855384.83031: Calling groups_plugins_play to load vars for managed_node3 30582 1726855384.83600: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024af 30582 1726855384.83604: WORKER PROCESS EXITING 30582 1726855384.83895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855384.84910: done with get_vars() 30582 1726855384.84927: done getting variables 30582 1726855384.84976: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:03:04 -0400 (0:00:00.063) 0:02:01.199 ****** 30582 1726855384.85003: entering _queue_task() for managed_node3/service 30582 1726855384.85268: worker is 1 (out of 1 available) 30582 1726855384.85284: exiting _queue_task() for managed_node3/service 30582 1726855384.85298: done queuing 
things up, now waiting for results queue to drain 30582 1726855384.85300: waiting for pending results... 30582 1726855384.85485: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855384.85795: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b0 30582 1726855384.85800: variable 'ansible_search_path' from source: unknown 30582 1726855384.85803: variable 'ansible_search_path' from source: unknown 30582 1726855384.85806: calling self._execute() 30582 1726855384.85808: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.85811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.85814: variable 'omit' from source: magic vars 30582 1726855384.86174: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.86196: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855384.86373: variable 'network_provider' from source: set_fact 30582 1726855384.86390: variable 'network_state' from source: role '' defaults 30582 1726855384.86409: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855384.86423: variable 'omit' from source: magic vars 30582 1726855384.86490: variable 'omit' from source: magic vars 30582 1726855384.86524: variable 'network_service_name' from source: role '' defaults 30582 1726855384.86599: variable 'network_service_name' from source: role '' defaults 30582 1726855384.86713: variable '__network_provider_setup' from source: role '' defaults 30582 1726855384.86726: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855384.86795: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855384.86812: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855384.86880: variable '__network_packages_default_nm' from source: role '' defaults 
30582 1726855384.87108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855384.88620: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855384.88673: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855384.88704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855384.88732: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855384.88752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855384.88812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.88836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.88853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.88880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.88892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.88926: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.88943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.88961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.88990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.89002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.89160: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855384.89245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.89261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.89282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.89311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.89321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.89389: variable 'ansible_python' from source: facts 30582 1726855384.89402: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855384.89458: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855384.89514: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855384.89598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.89616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.89633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.89657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.89673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.89707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855384.89727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855384.89743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.89769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855384.89780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855384.89875: variable 'network_connections' from source: include params 30582 1726855384.89882: variable 'interface' from source: play vars 30582 1726855384.89937: variable 'interface' from source: play vars 30582 1726855384.90011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855384.90142: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855384.90178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855384.90212: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855384.90241: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855384.90284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855384.90307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855384.90331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855384.90356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855384.90393: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.90571: variable 'network_connections' from source: include params 30582 1726855384.90574: variable 'interface' from source: play vars 30582 1726855384.90626: variable 'interface' from source: play vars 30582 1726855384.90652: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855384.90708: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855384.90891: variable 'network_connections' from source: include params 30582 1726855384.90898: variable 'interface' from source: play vars 30582 1726855384.90945: variable 'interface' from source: play vars 30582 1726855384.90962: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855384.91019: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855384.91203: variable 'network_connections' from source: include params 30582 1726855384.91207: variable 'interface' from source: play vars 30582 1726855384.91256: variable 'interface' from source: play vars 30582 1726855384.91292: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855384.91335: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855384.91339: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855384.91381: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855384.91518: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855384.91839: variable 'network_connections' from source: include params 30582 1726855384.91842: variable 'interface' from source: play vars 30582 1726855384.91890: variable 'interface' from source: play vars 30582 1726855384.91893: variable 'ansible_distribution' from source: facts 30582 1726855384.91897: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.91903: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.91914: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855384.92026: variable 'ansible_distribution' from source: facts 30582 1726855384.92029: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.92034: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.92045: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855384.92157: variable 'ansible_distribution' from source: facts 30582 1726855384.92160: variable '__network_rh_distros' from source: role '' defaults 30582 1726855384.92167: variable 'ansible_distribution_major_version' from source: facts 30582 1726855384.92194: variable 'network_provider' from source: set_fact 30582 1726855384.92211: variable 'omit' from source: magic vars 30582 1726855384.92232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855384.92253: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855384.92286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855384.92291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855384.92294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855384.92324: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855384.92327: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.92329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855384.92404: Set connection var ansible_timeout to 10 30582 1726855384.92407: Set connection var ansible_connection to ssh 30582 1726855384.92413: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855384.92418: Set connection var ansible_pipelining to False 30582 1726855384.92425: Set connection var ansible_shell_executable to /bin/sh 30582 1726855384.92427: Set connection var ansible_shell_type to sh 30582 1726855384.92444: variable 'ansible_shell_executable' from source: unknown 30582 1726855384.92446: variable 'ansible_connection' from source: unknown 30582 1726855384.92449: variable 'ansible_module_compression' from source: unknown 30582 1726855384.92451: variable 'ansible_shell_type' from source: unknown 30582 1726855384.92453: variable 'ansible_shell_executable' from source: unknown 30582 1726855384.92455: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855384.92460: variable 'ansible_pipelining' from source: unknown 30582 1726855384.92462: variable 'ansible_timeout' from source: unknown 30582 1726855384.92468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855384.92542: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855384.92551: variable 'omit' from source: magic vars 30582 1726855384.92557: starting attempt loop 30582 1726855384.92560: running the handler 30582 1726855384.92616: variable 'ansible_facts' from source: unknown 30582 1726855384.93106: _low_level_execute_command(): starting 30582 1726855384.93111: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855384.93603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855384.93607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.93610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855384.93614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.93666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855384.93669: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855384.93671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855384.93737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855384.95441: stdout chunk (state=3): >>>/root <<< 30582 1726855384.95538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855384.95569: stderr chunk (state=3): >>><<< 30582 1726855384.95572: stdout chunk (state=3): >>><<< 30582 1726855384.95589: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855384.95598: _low_level_execute_command(): starting 30582 1726855384.95605: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766 `" && echo ansible-tmp-1726855384.9558835-35945-227547665484766="` echo /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766 `" ) && sleep 0' 30582 1726855384.96037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855384.96040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855384.96043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.96045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855384.96047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.96093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855384.96114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855384.96171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855384.98084: stdout chunk (state=3): >>>ansible-tmp-1726855384.9558835-35945-227547665484766=/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766 <<< 30582 
1726855384.98190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855384.98217: stderr chunk (state=3): >>><<< 30582 1726855384.98220: stdout chunk (state=3): >>><<< 30582 1726855384.98237: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855384.9558835-35945-227547665484766=/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855384.98264: variable 'ansible_module_compression' from source: unknown 30582 1726855384.98308: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855384.98358: variable 'ansible_facts' from source: unknown 30582 1726855384.98497: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py 30582 1726855384.98602: Sending initial data 30582 1726855384.98605: Sent initial data (156 bytes) 30582 1726855384.99060: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855384.99063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855384.99069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.99072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855384.99074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855384.99121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855384.99135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855384.99193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.00756: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855385.00760: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855385.00809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855385.00872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdkr1ufam /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py <<< 30582 1726855385.00877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py" <<< 30582 1726855385.00933: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpdkr1ufam" to remote "/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py" <<< 30582 1726855385.00936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py" <<< 30582 1726855385.02062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.02106: stderr chunk (state=3): >>><<< 30582 1726855385.02109: stdout chunk (state=3): >>><<< 30582 1726855385.02149: done transferring module to remote 30582 1726855385.02158: _low_level_execute_command(): starting 30582 1726855385.02166: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/ /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py && sleep 0' 30582 1726855385.02620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855385.02623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855385.02626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.02628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855385.02630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855385.02632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.02686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.02698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.02700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.02750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.04524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.04549: 
stderr chunk (state=3): >>><<< 30582 1726855385.04552: stdout chunk (state=3): >>><<< 30582 1726855385.04569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855385.04573: _low_level_execute_command(): starting 30582 1726855385.04575: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/AnsiballZ_systemd.py && sleep 0' 30582 1726855385.05027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855385.05030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found <<< 30582 1726855385.05032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855385.05034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.05036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.05093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.05097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.05107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.05165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.34298: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": 
"org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315220480", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2318602000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855385.34305: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", 
"LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", 
"RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.s<<< 30582 1726855385.34316: stdout chunk (state=3): >>>ocket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": 
"running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855385.36195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855385.36225: stderr chunk (state=3): >>><<< 30582 1726855385.36228: stdout chunk (state=3): >>><<< 30582 1726855385.36247: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10674176", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315220480", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2318602000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855385.36373: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855385.36391: _low_level_execute_command(): starting 30582 1726855385.36396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855384.9558835-35945-227547665484766/ > /dev/null 2>&1 && sleep 0' 30582 1726855385.36847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855385.36851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855385.36853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.36855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.36857: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.36906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.36910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.36921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.36984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.38825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.38855: stderr chunk (state=3): >>><<< 30582 1726855385.38858: stdout chunk (state=3): >>><<< 30582 1726855385.38872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855385.38878: handler run complete 30582 1726855385.38918: attempt loop complete, returning result 30582 1726855385.38921: _execute() done 30582 1726855385.38924: dumping result to json 30582 1726855385.38939: done dumping result, returning 30582 1726855385.38947: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-0000000024b0] 30582 1726855385.38955: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b0 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855385.39259: no more pending results, returning what we have 30582 1726855385.39262: results queue empty 30582 1726855385.39266: checking for any_errors_fatal 30582 1726855385.39272: done checking for any_errors_fatal 30582 1726855385.39273: checking for max_fail_percentage 30582 1726855385.39275: done checking for max_fail_percentage 30582 1726855385.39276: checking to see if all hosts have failed and the running result is not ok 30582 1726855385.39276: done checking to see if all hosts have failed 30582 1726855385.39277: getting the remaining hosts for this loop 30582 1726855385.39279: done getting the remaining hosts for this loop 30582 1726855385.39282: getting the next task for host managed_node3 30582 1726855385.39291: done getting next task for host managed_node3 30582 1726855385.39295: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855385.39301: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855385.39315: getting variables 30582 1726855385.39317: in VariableManager get_vars() 30582 1726855385.39354: Calling all_inventory to load vars for managed_node3 30582 1726855385.39357: Calling groups_inventory to load vars for managed_node3 30582 1726855385.39359: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855385.39371: Calling all_plugins_play to load vars for managed_node3 30582 1726855385.39373: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855385.39376: Calling groups_plugins_play to load vars for managed_node3 30582 1726855385.39901: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b0 30582 1726855385.39905: WORKER PROCESS EXITING 30582 1726855385.40212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855385.41098: done with get_vars() 30582 1726855385.41115: done getting variables 30582 1726855385.41159: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:03:05 -0400 (0:00:00.561) 0:02:01.761 ****** 30582 1726855385.41192: entering _queue_task() for managed_node3/service 30582 1726855385.41445: worker is 1 (out of 1 available) 30582 1726855385.41460: exiting _queue_task() for managed_node3/service 30582 1726855385.41474: done queuing things up, now waiting for results queue to drain 30582 1726855385.41476: waiting for pending results... 30582 1726855385.41658: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855385.41757: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b1 30582 1726855385.41770: variable 'ansible_search_path' from source: unknown 30582 1726855385.41774: variable 'ansible_search_path' from source: unknown 30582 1726855385.41804: calling self._execute() 30582 1726855385.41878: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.41883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.41891: variable 'omit' from source: magic vars 30582 1726855385.42177: variable 'ansible_distribution_major_version' from source: facts 30582 1726855385.42186: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855385.42269: variable 'network_provider' from source: set_fact 30582 1726855385.42273: Evaluated conditional (network_provider == "nm"): True 30582 1726855385.42337: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 
1726855385.42403: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855385.42517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855385.44246: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855385.44292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855385.44321: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855385.44345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855385.44366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855385.44425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855385.44447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855385.44467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855385.44493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855385.44504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30582 1726855385.44538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855385.44556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855385.44573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855385.44600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855385.44611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855385.44639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855385.44658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855385.44675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855385.44701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855385.44711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855385.44810: variable 'network_connections' from source: include params 30582 1726855385.44820: variable 'interface' from source: play vars 30582 1726855385.44875: variable 'interface' from source: play vars 30582 1726855385.44920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855385.45042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855385.45070: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855385.45097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855385.45119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855385.45149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855385.45166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855385.45182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855385.45204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855385.45240: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855385.45392: variable 'network_connections' from source: include params 30582 1726855385.45397: variable 'interface' from source: play vars 30582 1726855385.45442: variable 'interface' from source: play vars 30582 1726855385.45466: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855385.45470: when evaluation is False, skipping this task 30582 1726855385.45472: _execute() done 30582 1726855385.45474: dumping result to json 30582 1726855385.45476: done dumping result, returning 30582 1726855385.45483: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-0000000024b1] 30582 1726855385.45496: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b1 30582 1726855385.45584: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b1 30582 1726855385.45592: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855385.45637: no more pending results, returning what we have 30582 1726855385.45640: results queue empty 30582 1726855385.45641: checking for any_errors_fatal 30582 1726855385.45660: done checking for any_errors_fatal 30582 1726855385.45661: checking for max_fail_percentage 30582 1726855385.45665: done checking for max_fail_percentage 30582 1726855385.45666: checking to see if all hosts have failed and the running result is not ok 30582 1726855385.45667: done checking to see if all hosts have failed 30582 1726855385.45667: getting the remaining hosts for this loop 30582 1726855385.45669: done getting the remaining hosts for this loop 30582 1726855385.45672: getting the next task 
for host managed_node3 30582 1726855385.45681: done getting next task for host managed_node3 30582 1726855385.45685: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855385.45692: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855385.45722: getting variables 30582 1726855385.45724: in VariableManager get_vars() 30582 1726855385.45769: Calling all_inventory to load vars for managed_node3 30582 1726855385.45772: Calling groups_inventory to load vars for managed_node3 30582 1726855385.45774: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855385.45784: Calling all_plugins_play to load vars for managed_node3 30582 1726855385.45792: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855385.45795: Calling groups_plugins_play to load vars for managed_node3 30582 1726855385.46751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855385.47619: done with get_vars() 30582 1726855385.47639: done getting variables 30582 1726855385.47686: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:03:05 -0400 (0:00:00.065) 0:02:01.827 ****** 30582 1726855385.47713: entering _queue_task() for managed_node3/service 30582 1726855385.47978: worker is 1 (out of 1 available) 30582 1726855385.47994: exiting _queue_task() for managed_node3/service 30582 1726855385.48007: done queuing things up, now waiting for results queue to drain 30582 1726855385.48009: waiting for pending results... 
30582 1726855385.48194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855385.48290: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b2 30582 1726855385.48305: variable 'ansible_search_path' from source: unknown 30582 1726855385.48309: variable 'ansible_search_path' from source: unknown 30582 1726855385.48340: calling self._execute() 30582 1726855385.48411: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.48414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.48425: variable 'omit' from source: magic vars 30582 1726855385.48708: variable 'ansible_distribution_major_version' from source: facts 30582 1726855385.48718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855385.48798: variable 'network_provider' from source: set_fact 30582 1726855385.48804: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855385.48807: when evaluation is False, skipping this task 30582 1726855385.48810: _execute() done 30582 1726855385.48812: dumping result to json 30582 1726855385.48816: done dumping result, returning 30582 1726855385.48824: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-0000000024b2] 30582 1726855385.48831: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b2 30582 1726855385.48923: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b2 30582 1726855385.48926: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855385.48973: no more pending results, returning what we have 30582 1726855385.48976: results queue empty 30582 1726855385.48977: checking for any_errors_fatal 30582 1726855385.48985: done checking for 
any_errors_fatal 30582 1726855385.48986: checking for max_fail_percentage 30582 1726855385.48990: done checking for max_fail_percentage 30582 1726855385.48991: checking to see if all hosts have failed and the running result is not ok 30582 1726855385.48991: done checking to see if all hosts have failed 30582 1726855385.48992: getting the remaining hosts for this loop 30582 1726855385.48994: done getting the remaining hosts for this loop 30582 1726855385.48997: getting the next task for host managed_node3 30582 1726855385.49005: done getting next task for host managed_node3 30582 1726855385.49008: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855385.49014: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855385.49046: getting variables 30582 1726855385.49048: in VariableManager get_vars() 30582 1726855385.49090: Calling all_inventory to load vars for managed_node3 30582 1726855385.49093: Calling groups_inventory to load vars for managed_node3 30582 1726855385.49095: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855385.49104: Calling all_plugins_play to load vars for managed_node3 30582 1726855385.49106: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855385.49108: Calling groups_plugins_play to load vars for managed_node3 30582 1726855385.49880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855385.50754: done with get_vars() 30582 1726855385.50775: done getting variables 30582 1726855385.50818: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:03:05 -0400 (0:00:00.031) 0:02:01.858 ****** 30582 1726855385.50844: entering _queue_task() for managed_node3/copy 30582 1726855385.51093: worker is 1 (out of 1 available) 30582 1726855385.51108: exiting _queue_task() for managed_node3/copy 30582 1726855385.51120: done queuing things up, now waiting for results queue to drain 30582 1726855385.51122: waiting for pending results... 
30582 1726855385.51305: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855385.51406: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b3 30582 1726855385.51416: variable 'ansible_search_path' from source: unknown 30582 1726855385.51419: variable 'ansible_search_path' from source: unknown 30582 1726855385.51447: calling self._execute() 30582 1726855385.51520: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.51523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.51531: variable 'omit' from source: magic vars 30582 1726855385.51809: variable 'ansible_distribution_major_version' from source: facts 30582 1726855385.51818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855385.51904: variable 'network_provider' from source: set_fact 30582 1726855385.51909: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855385.51912: when evaluation is False, skipping this task 30582 1726855385.51915: _execute() done 30582 1726855385.51917: dumping result to json 30582 1726855385.51920: done dumping result, returning 30582 1726855385.51929: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-0000000024b3] 30582 1726855385.51934: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b3 30582 1726855385.52025: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b3 30582 1726855385.52028: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855385.52079: no more pending results, returning what we have 30582 1726855385.52082: results queue empty 30582 1726855385.52083: checking for 
any_errors_fatal 30582 1726855385.52093: done checking for any_errors_fatal 30582 1726855385.52094: checking for max_fail_percentage 30582 1726855385.52096: done checking for max_fail_percentage 30582 1726855385.52097: checking to see if all hosts have failed and the running result is not ok 30582 1726855385.52098: done checking to see if all hosts have failed 30582 1726855385.52098: getting the remaining hosts for this loop 30582 1726855385.52100: done getting the remaining hosts for this loop 30582 1726855385.52103: getting the next task for host managed_node3 30582 1726855385.52111: done getting next task for host managed_node3 30582 1726855385.52115: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855385.52120: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855385.52148: getting variables 30582 1726855385.52149: in VariableManager get_vars() 30582 1726855385.52194: Calling all_inventory to load vars for managed_node3 30582 1726855385.52196: Calling groups_inventory to load vars for managed_node3 30582 1726855385.52198: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855385.52207: Calling all_plugins_play to load vars for managed_node3 30582 1726855385.52210: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855385.52212: Calling groups_plugins_play to load vars for managed_node3 30582 1726855385.53107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855385.53983: done with get_vars() 30582 1726855385.54003: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:03:05 -0400 (0:00:00.032) 0:02:01.890 ****** 30582 1726855385.54069: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855385.54330: worker is 1 (out of 1 available) 30582 1726855385.54345: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855385.54357: done queuing things up, now waiting for results queue to drain 30582 1726855385.54359: waiting for pending results... 
30582 1726855385.54547: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855385.54650: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b4 30582 1726855385.54665: variable 'ansible_search_path' from source: unknown 30582 1726855385.54669: variable 'ansible_search_path' from source: unknown 30582 1726855385.54699: calling self._execute() 30582 1726855385.54793: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.54797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.54802: variable 'omit' from source: magic vars 30582 1726855385.55069: variable 'ansible_distribution_major_version' from source: facts 30582 1726855385.55077: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855385.55082: variable 'omit' from source: magic vars 30582 1726855385.55124: variable 'omit' from source: magic vars 30582 1726855385.55238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855385.56696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855385.56739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855385.56772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855385.56795: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855385.56815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855385.56880: variable 'network_provider' from source: set_fact 30582 1726855385.56970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855385.56992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855385.57010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855385.57036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855385.57047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855385.57104: variable 'omit' from source: magic vars 30582 1726855385.57176: variable 'omit' from source: magic vars 30582 1726855385.57248: variable 'network_connections' from source: include params 30582 1726855385.57259: variable 'interface' from source: play vars 30582 1726855385.57304: variable 'interface' from source: play vars 30582 1726855385.57406: variable 'omit' from source: magic vars 30582 1726855385.57412: variable '__lsr_ansible_managed' from source: task vars 30582 1726855385.57457: variable '__lsr_ansible_managed' from source: task vars 30582 1726855385.57590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855385.57724: Loaded config def from plugin (lookup/template) 30582 1726855385.57728: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855385.57751: File lookup term: get_ansible_managed.j2 30582 1726855385.57754: variable 
'ansible_search_path' from source: unknown 30582 1726855385.57757: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855385.57770: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855385.57783: variable 'ansible_search_path' from source: unknown 30582 1726855385.61253: variable 'ansible_managed' from source: unknown 30582 1726855385.61340: variable 'omit' from source: magic vars 30582 1726855385.61362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855385.61386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855385.61401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855385.61414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855385.61422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855385.61445: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855385.61448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.61452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.61518: Set connection var ansible_timeout to 10 30582 1726855385.61521: Set connection var ansible_connection to ssh 30582 1726855385.61526: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855385.61531: Set connection var ansible_pipelining to False 30582 1726855385.61536: Set connection var ansible_shell_executable to /bin/sh 30582 1726855385.61538: Set connection var ansible_shell_type to sh 30582 1726855385.61557: variable 'ansible_shell_executable' from source: unknown 30582 1726855385.61560: variable 'ansible_connection' from source: unknown 30582 1726855385.61565: variable 'ansible_module_compression' from source: unknown 30582 1726855385.61568: variable 'ansible_shell_type' from source: unknown 30582 1726855385.61570: variable 'ansible_shell_executable' from source: unknown 30582 1726855385.61572: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855385.61574: variable 'ansible_pipelining' from source: unknown 30582 1726855385.61577: variable 'ansible_timeout' from source: unknown 30582 1726855385.61579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855385.61669: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855385.61681: variable 'omit' from 
source: magic vars 30582 1726855385.61684: starting attempt loop 30582 1726855385.61686: running the handler 30582 1726855385.61697: _low_level_execute_command(): starting 30582 1726855385.61703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855385.62193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.62199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.62201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855385.62212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855385.62223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.62259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.62262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.62273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.62347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.64043: stdout chunk (state=3): >>>/root <<< 30582 1726855385.64134: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30582 1726855385.64168: stderr chunk (state=3): >>><<< 30582 1726855385.64171: stdout chunk (state=3): >>><<< 30582 1726855385.64195: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855385.64206: _low_level_execute_command(): starting 30582 1726855385.64212: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720 `" && echo ansible-tmp-1726855385.641953-35959-37857941550720="` echo /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720 `" ) && sleep 0' 30582 1726855385.64661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
30582 1726855385.64667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855385.64670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.64672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.64674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.64726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.64730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.64732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.64795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.66702: stdout chunk (state=3): >>>ansible-tmp-1726855385.641953-35959-37857941550720=/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720 <<< 30582 1726855385.66805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.66835: stderr chunk (state=3): >>><<< 30582 1726855385.66841: stdout chunk (state=3): >>><<< 30582 1726855385.66856: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855385.641953-35959-37857941550720=/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855385.66898: variable 'ansible_module_compression' from source: unknown 30582 1726855385.66937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855385.66977: variable 'ansible_facts' from source: unknown 30582 1726855385.67068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py 30582 1726855385.67163: Sending initial data 30582 1726855385.67166: Sent initial data (166 bytes) 30582 1726855385.67617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.67624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855385.67627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.67629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.67631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.67678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.67681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.67747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.69314: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855385.69370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855385.69427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpfqgu27gn /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py <<< 30582 1726855385.69433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py" <<< 30582 1726855385.69489: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpfqgu27gn" to remote "/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py" <<< 30582 1726855385.69492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py" <<< 30582 1726855385.70290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.70335: stderr chunk (state=3): >>><<< 30582 1726855385.70339: stdout chunk (state=3): >>><<< 30582 1726855385.70382: done transferring module to remote 30582 1726855385.70392: _low_level_execute_command(): starting 30582 1726855385.70397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/ /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py && sleep 0' 30582 1726855385.70848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855385.70851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855385.70854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.70856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.70858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855385.70860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.70913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.70919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.70921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.70976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855385.72778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855385.72809: stderr chunk (state=3): >>><<< 30582 1726855385.72812: stdout chunk (state=3): >>><<< 30582 1726855385.72825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855385.72828: _low_level_execute_command(): starting 30582 1726855385.72833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/AnsiballZ_network_connections.py && sleep 0' 30582 1726855385.73270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855385.73273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.73275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration <<< 30582 1726855385.73277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855385.73283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855385.73331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855385.73335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855385.73339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855385.73403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.06902: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cc51mrt0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cc51mrt0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/02f79b0a-2569-4459-9e63-b8baa27c9d76: error=unknown <<< 30582 1726855386.07105: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", 
"persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855386.08966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855386.08995: stderr chunk (state=3): >>><<< 30582 1726855386.08998: stdout chunk (state=3): >>><<< 30582 1726855386.09015: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cc51mrt0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cc51mrt0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/02f79b0a-2569-4459-9e63-b8baa27c9d76: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", 
"persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855386.09045: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855386.09054: _low_level_execute_command(): starting 30582 1726855386.09058: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855385.641953-35959-37857941550720/ > /dev/null 2>&1 && sleep 0' 30582 1726855386.09512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.09515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.09517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855386.09519: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.09521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.09578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.09581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855386.09586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.09647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.11527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.11553: stderr chunk (state=3): >>><<< 30582 1726855386.11566: stdout chunk (state=3): >>><<< 30582 1726855386.11578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855386.11584: handler run complete 30582 1726855386.11606: attempt loop complete, returning result 30582 1726855386.11609: _execute() done 30582 1726855386.11611: dumping result to json 30582 1726855386.11616: done dumping result, returning 30582 1726855386.11625: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-0000000024b4] 30582 1726855386.11628: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b4 30582 1726855386.11730: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b4 30582 1726855386.11733: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30582 1726855386.11839: no more pending results, returning what we have 30582 1726855386.11842: results queue empty 30582 1726855386.11843: checking for any_errors_fatal 30582 1726855386.11850: done checking for any_errors_fatal 30582 1726855386.11852: checking for max_fail_percentage 30582 1726855386.11854: done checking for max_fail_percentage 30582 1726855386.11855: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.11855: done checking to see if all hosts have failed 30582 1726855386.11856: getting the remaining hosts for this loop 30582 1726855386.11858: done getting the remaining hosts for this loop 30582 1726855386.11861: 
getting the next task for host managed_node3 30582 1726855386.11870: done getting next task for host managed_node3 30582 1726855386.11874: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855386.11879: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.11894: getting variables 30582 1726855386.11895: in VariableManager get_vars() 30582 1726855386.11938: Calling all_inventory to load vars for managed_node3 30582 1726855386.11941: Calling groups_inventory to load vars for managed_node3 30582 1726855386.11943: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.11953: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.11955: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.11958: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.12811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.13818: done with get_vars() 30582 1726855386.13839: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:03:06 -0400 (0:00:00.598) 0:02:02.489 ****** 30582 1726855386.13908: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855386.14189: worker is 1 (out of 1 available) 30582 1726855386.14203: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855386.14216: done queuing things up, now waiting for results queue to drain 30582 1726855386.14218: waiting for pending results... 
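The nested "HOST STATE" dumps above record where the play iterator is inside each included task list: every level carries its own block/task cursor plus an optional child state for the include it is currently executing. A minimal sketch of that shape (not Ansible's actual `HostState` class; the field names and `depth` helper here are illustrative assumptions):

```python
# Illustrative model of the nested HOST STATE dumps in the log above.
# Each level tracks a block/task cursor; tasks_child_state points at the
# state of the included task list currently being iterated, or None.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int
    task: int
    tasks_child_state: Optional["HostState"] = None

    def depth(self) -> int:
        # Count how many include levels are still being iterated.
        return 1 + (self.tasks_child_state.depth()
                    if self.tasks_child_state else 0)

# The state printed above (block=8/task=2 -> 0/8 -> 0/2 -> 0/21)
# nests four levels deep:
state = HostState(8, 2, HostState(0, 8, HostState(0, 2, HostState(0, 21))))
print(state.depth())  # 4
```

Reading the dump this way, the innermost cursor (task=21, then 22, 23, 24, 25 in later dumps) is what advances as the role works through its task file.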
30582 1726855386.14416: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855386.14499: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b5 30582 1726855386.14513: variable 'ansible_search_path' from source: unknown 30582 1726855386.14517: variable 'ansible_search_path' from source: unknown 30582 1726855386.14548: calling self._execute() 30582 1726855386.14628: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.14632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.14644: variable 'omit' from source: magic vars 30582 1726855386.14953: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.14963: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.15054: variable 'network_state' from source: role '' defaults 30582 1726855386.15063: Evaluated conditional (network_state != {}): False 30582 1726855386.15068: when evaluation is False, skipping this task 30582 1726855386.15071: _execute() done 30582 1726855386.15075: dumping result to json 30582 1726855386.15077: done dumping result, returning 30582 1726855386.15086: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-0000000024b5] 30582 1726855386.15094: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b5 30582 1726855386.15185: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b5 30582 1726855386.15192: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855386.15246: no more pending results, returning what we have 30582 1726855386.15251: results queue empty 30582 1726855386.15252: checking for any_errors_fatal 30582 1726855386.15266: done checking for any_errors_fatal 
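The skip above follows from the role's conditional: "Configure networking state" only runs when `network_state` is non-empty, and here it still holds the role default `{}`. A hedged sketch of that evaluation (the fact value `"10"` is a hypothetical stand-in; the real check happens through Jinja templating, not plain Python):

```python
# Sketch of the two conditionals the log evaluates for this task.
network_state = {}  # role default, per "from source: role '' defaults"
ansible_distribution_major_version = "10"  # hypothetical fact value

cond_version = ansible_distribution_major_version != "6"  # True in the log
cond_state = network_state != {}                          # False in the log

run_task = cond_version and cond_state
print(run_task)  # False -> "when evaluation is False, skipping this task"
```

This is why the result reports `"false_condition": "network_state != {}"`: the first conditional passed and the second failed.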
30582 1726855386.15267: checking for max_fail_percentage 30582 1726855386.15269: done checking for max_fail_percentage 30582 1726855386.15270: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.15270: done checking to see if all hosts have failed 30582 1726855386.15271: getting the remaining hosts for this loop 30582 1726855386.15273: done getting the remaining hosts for this loop 30582 1726855386.15277: getting the next task for host managed_node3 30582 1726855386.15285: done getting next task for host managed_node3 30582 1726855386.15291: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855386.15297: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.15332: getting variables 30582 1726855386.15333: in VariableManager get_vars() 30582 1726855386.15380: Calling all_inventory to load vars for managed_node3 30582 1726855386.15383: Calling groups_inventory to load vars for managed_node3 30582 1726855386.15384: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.15400: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.15403: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.15406: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.16234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.17112: done with get_vars() 30582 1726855386.17134: done getting variables 30582 1726855386.17180: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:03:06 -0400 (0:00:00.032) 0:02:02.522 ****** 30582 1726855386.17209: entering _queue_task() for managed_node3/debug 30582 1726855386.17485: worker is 1 (out of 1 available) 30582 1726855386.17502: exiting _queue_task() for managed_node3/debug 30582 1726855386.17515: done queuing things up, now waiting for results queue to drain 30582 1726855386.17517: waiting for pending results... 
30582 1726855386.17715: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855386.17799: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b6 30582 1726855386.17812: variable 'ansible_search_path' from source: unknown 30582 1726855386.17816: variable 'ansible_search_path' from source: unknown 30582 1726855386.17843: calling self._execute() 30582 1726855386.17925: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.17929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.17938: variable 'omit' from source: magic vars 30582 1726855386.18240: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.18250: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.18257: variable 'omit' from source: magic vars 30582 1726855386.18311: variable 'omit' from source: magic vars 30582 1726855386.18335: variable 'omit' from source: magic vars 30582 1726855386.18373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855386.18403: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855386.18420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855386.18435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.18445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.18471: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855386.18474: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.18476: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855386.18552: Set connection var ansible_timeout to 10 30582 1726855386.18556: Set connection var ansible_connection to ssh 30582 1726855386.18561: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855386.18568: Set connection var ansible_pipelining to False 30582 1726855386.18573: Set connection var ansible_shell_executable to /bin/sh 30582 1726855386.18576: Set connection var ansible_shell_type to sh 30582 1726855386.18595: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.18598: variable 'ansible_connection' from source: unknown 30582 1726855386.18601: variable 'ansible_module_compression' from source: unknown 30582 1726855386.18603: variable 'ansible_shell_type' from source: unknown 30582 1726855386.18607: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.18609: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.18611: variable 'ansible_pipelining' from source: unknown 30582 1726855386.18613: variable 'ansible_timeout' from source: unknown 30582 1726855386.18621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.18726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855386.18735: variable 'omit' from source: magic vars 30582 1726855386.18741: starting attempt loop 30582 1726855386.18743: running the handler 30582 1726855386.18845: variable '__network_connections_result' from source: set_fact 30582 1726855386.18892: handler run complete 30582 1726855386.18906: attempt loop complete, returning result 30582 1726855386.18909: _execute() done 30582 1726855386.18911: dumping result to json 30582 1726855386.18914: 
done dumping result, returning 30582 1726855386.18923: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000024b6] 30582 1726855386.18928: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b6 30582 1726855386.19019: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b6 30582 1726855386.19022: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 30582 1726855386.19138: no more pending results, returning what we have 30582 1726855386.19141: results queue empty 30582 1726855386.19142: checking for any_errors_fatal 30582 1726855386.19148: done checking for any_errors_fatal 30582 1726855386.19148: checking for max_fail_percentage 30582 1726855386.19150: done checking for max_fail_percentage 30582 1726855386.19151: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.19152: done checking to see if all hosts have failed 30582 1726855386.19153: getting the remaining hosts for this loop 30582 1726855386.19154: done getting the remaining hosts for this loop 30582 1726855386.19158: getting the next task for host managed_node3 30582 1726855386.19166: done getting next task for host managed_node3 30582 1726855386.19170: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855386.19175: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855386.19189: getting variables 30582 1726855386.19191: in VariableManager get_vars() 30582 1726855386.19236: Calling all_inventory to load vars for managed_node3 30582 1726855386.19239: Calling groups_inventory to load vars for managed_node3 30582 1726855386.19241: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.19250: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.19252: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.19254: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.20247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.21111: done with get_vars() 30582 1726855386.21130: done getting variables 30582 1726855386.21179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:03:06 -0400 (0:00:00.040) 0:02:02.562 ****** 30582 1726855386.21212: entering _queue_task() for managed_node3/debug 30582 1726855386.21480: worker is 1 (out of 1 available) 30582 1726855386.21497: exiting _queue_task() for managed_node3/debug 30582 1726855386.21510: done queuing things up, now waiting for results queue to drain 30582 1726855386.21511: waiting for pending results... 30582 1726855386.21707: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855386.21807: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b7 30582 1726855386.21818: variable 'ansible_search_path' from source: unknown 30582 1726855386.21822: variable 'ansible_search_path' from source: unknown 30582 1726855386.21853: calling self._execute() 30582 1726855386.21932: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.21936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.21944: variable 'omit' from source: magic vars 30582 1726855386.22243: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.22253: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.22259: variable 'omit' from source: magic vars 30582 1726855386.22311: variable 'omit' from source: magic vars 30582 1726855386.22334: variable 'omit' from source: magic vars 30582 1726855386.22366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855386.22399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855386.22416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855386.22431: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.22441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.22465: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855386.22471: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.22474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.22550: Set connection var ansible_timeout to 10 30582 1726855386.22553: Set connection var ansible_connection to ssh 30582 1726855386.22558: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855386.22563: Set connection var ansible_pipelining to False 30582 1726855386.22571: Set connection var ansible_shell_executable to /bin/sh 30582 1726855386.22573: Set connection var ansible_shell_type to sh 30582 1726855386.22592: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.22595: variable 'ansible_connection' from source: unknown 30582 1726855386.22598: variable 'ansible_module_compression' from source: unknown 30582 1726855386.22600: variable 'ansible_shell_type' from source: unknown 30582 1726855386.22604: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.22606: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.22608: variable 'ansible_pipelining' from source: unknown 30582 1726855386.22611: variable 'ansible_timeout' from source: unknown 30582 1726855386.22618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.22718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855386.22791: variable 'omit' from source: magic vars 30582 1726855386.22793: starting attempt loop 30582 1726855386.22795: running the handler 30582 1726855386.22796: variable '__network_connections_result' from source: set_fact 30582 1726855386.22844: variable '__network_connections_result' from source: set_fact 30582 1726855386.22924: handler run complete 30582 1726855386.22942: attempt loop complete, returning result 30582 1726855386.22945: _execute() done 30582 1726855386.22947: dumping result to json 30582 1726855386.22949: done dumping result, returning 30582 1726855386.22959: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000024b7] 30582 1726855386.22964: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b7 30582 1726855386.23062: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b7 30582 1726855386.23064: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30582 1726855386.23158: no more pending results, returning what we have 30582 1726855386.23162: results queue empty 30582 1726855386.23163: checking for any_errors_fatal 30582 1726855386.23170: done checking for any_errors_fatal 30582 1726855386.23171: checking for max_fail_percentage 30582 1726855386.23173: done checking for max_fail_percentage 30582 1726855386.23174: checking to see if 
all hosts have failed and the running result is not ok 30582 1726855386.23174: done checking to see if all hosts have failed 30582 1726855386.23175: getting the remaining hosts for this loop 30582 1726855386.23177: done getting the remaining hosts for this loop 30582 1726855386.23180: getting the next task for host managed_node3 30582 1726855386.23190: done getting next task for host managed_node3 30582 1726855386.23194: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855386.23199: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.23212: getting variables 30582 1726855386.23214: in VariableManager get_vars() 30582 1726855386.23254: Calling all_inventory to load vars for managed_node3 30582 1726855386.23256: Calling groups_inventory to load vars for managed_node3 30582 1726855386.23258: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.23268: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.23270: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.23279: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.24103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.24980: done with get_vars() 30582 1726855386.25007: done getting variables 30582 1726855386.25054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:03:06 -0400 (0:00:00.038) 0:02:02.600 ****** 30582 1726855386.25081: entering _queue_task() for managed_node3/debug 30582 1726855386.25350: worker is 1 (out of 1 available) 30582 1726855386.25364: exiting _queue_task() for managed_node3/debug 30582 1726855386.25377: done queuing things up, now waiting for results queue to drain 30582 1726855386.25379: waiting for pending results... 
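The two "Show ... messages" debug tasks above print fields of `__network_connections_result`, the fact the role stored after the connection module ran. Its `stderr` was the single newline `"\n"`, which is why `stderr_lines` comes out as one empty string, as a quick check shows:

```python
# Why the log shows "stderr_lines": [ "" ]: splitting the captured
# stderr ("\n") on line boundaries yields exactly one empty line.
stderr = "\n"                       # stderr captured in the module result
stderr_lines = stderr.splitlines()  # how stderr_lines is derived
print(stderr_lines)  # ['']
```

An all-empty `stderr_lines` like this is the role's signal that the provider emitted no warnings or errors while removing the `statebr` profile.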
30582 1726855386.25580: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855386.25679: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b8 30582 1726855386.25694: variable 'ansible_search_path' from source: unknown 30582 1726855386.25698: variable 'ansible_search_path' from source: unknown 30582 1726855386.25729: calling self._execute() 30582 1726855386.25807: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.25811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.25822: variable 'omit' from source: magic vars 30582 1726855386.26117: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.26126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.26215: variable 'network_state' from source: role '' defaults 30582 1726855386.26225: Evaluated conditional (network_state != {}): False 30582 1726855386.26228: when evaluation is False, skipping this task 30582 1726855386.26231: _execute() done 30582 1726855386.26233: dumping result to json 30582 1726855386.26236: done dumping result, returning 30582 1726855386.26244: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-0000000024b8] 30582 1726855386.26250: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b8 30582 1726855386.26346: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b8 30582 1726855386.26349: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855386.26406: no more pending results, returning what we have 30582 1726855386.26410: results queue empty 30582 1726855386.26411: checking for any_errors_fatal 30582 1726855386.26423: done checking for any_errors_fatal 30582 1726855386.26423: checking for 
max_fail_percentage 30582 1726855386.26425: done checking for max_fail_percentage 30582 1726855386.26426: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.26427: done checking to see if all hosts have failed 30582 1726855386.26428: getting the remaining hosts for this loop 30582 1726855386.26429: done getting the remaining hosts for this loop 30582 1726855386.26432: getting the next task for host managed_node3 30582 1726855386.26441: done getting next task for host managed_node3 30582 1726855386.26444: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855386.26451: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.26482: getting variables 30582 1726855386.26484: in VariableManager get_vars() 30582 1726855386.26527: Calling all_inventory to load vars for managed_node3 30582 1726855386.26530: Calling groups_inventory to load vars for managed_node3 30582 1726855386.26532: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.26542: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.26544: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.26547: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.27495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.28360: done with get_vars() 30582 1726855386.28386: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:03:06 -0400 (0:00:00.033) 0:02:02.634 ****** 30582 1726855386.28463: entering _queue_task() for managed_node3/ping 30582 1726855386.28741: worker is 1 (out of 1 available) 30582 1726855386.28755: exiting _queue_task() for managed_node3/ping 30582 1726855386.28767: done queuing things up, now waiting for results queue to drain 30582 1726855386.28769: waiting for pending results... 
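The "Re-test connectivity" task queued above uses the `ping` module, which is not ICMP: it simply round-trips a payload through the module machinery to prove the control node can still reach the host, log in, and execute Python after the network changes. A minimal sketch of that contract (mirroring `ansible.builtin.ping`'s documented `data` parameter; this is not the module's actual source):

```python
# Sketch of the ping module's contract: echo the payload back, or fail
# deliberately when asked to, proving end-to-end module execution works.
def ping(data: str = "pong") -> dict:
    if data == "crash":
        # ping's documented way to force an exception for testing
        raise RuntimeError("boom")
    return {"ping": data}

print(ping())  # {'ping': 'pong'}
```

A successful `pong` here confirms that bringing the `statebr` connection down did not cut off the management path to `managed_node3`.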
30582 1726855386.28970: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855386.29073: in run() - task 0affcc66-ac2b-aa83-7d57-0000000024b9 30582 1726855386.29085: variable 'ansible_search_path' from source: unknown 30582 1726855386.29092: variable 'ansible_search_path' from source: unknown 30582 1726855386.29122: calling self._execute() 30582 1726855386.29202: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.29207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.29216: variable 'omit' from source: magic vars 30582 1726855386.29505: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.29514: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.29520: variable 'omit' from source: magic vars 30582 1726855386.29572: variable 'omit' from source: magic vars 30582 1726855386.29598: variable 'omit' from source: magic vars 30582 1726855386.29633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855386.29662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855386.29681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855386.29696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.29706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855386.29729: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855386.29732: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.29735: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855386.29811: Set connection var ansible_timeout to 10 30582 1726855386.29814: Set connection var ansible_connection to ssh 30582 1726855386.29819: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855386.29824: Set connection var ansible_pipelining to False 30582 1726855386.29829: Set connection var ansible_shell_executable to /bin/sh 30582 1726855386.29831: Set connection var ansible_shell_type to sh 30582 1726855386.29848: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.29852: variable 'ansible_connection' from source: unknown 30582 1726855386.29855: variable 'ansible_module_compression' from source: unknown 30582 1726855386.29857: variable 'ansible_shell_type' from source: unknown 30582 1726855386.29861: variable 'ansible_shell_executable' from source: unknown 30582 1726855386.29864: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.29866: variable 'ansible_pipelining' from source: unknown 30582 1726855386.29873: variable 'ansible_timeout' from source: unknown 30582 1726855386.29875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.30023: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855386.30031: variable 'omit' from source: magic vars 30582 1726855386.30036: starting attempt loop 30582 1726855386.30039: running the handler 30582 1726855386.30050: _low_level_execute_command(): starting 30582 1726855386.30057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855386.30550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855386.30580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.30583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855386.30590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.30639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.30642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855386.30645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.30717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.32409: stdout chunk (state=3): >>>/root <<< 30582 1726855386.32503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.32535: stderr chunk (state=3): >>><<< 30582 1726855386.32538: stdout chunk (state=3): >>><<< 30582 1726855386.32561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855386.32574: _low_level_execute_command(): starting 30582 1726855386.32581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524 `" && echo ansible-tmp-1726855386.3256137-35974-118806611949524="` echo /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524 `" ) && sleep 0' 30582 1726855386.33037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855386.33040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855386.33043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.33055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.33058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.33102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.33105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.33175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.35085: stdout chunk (state=3): >>>ansible-tmp-1726855386.3256137-35974-118806611949524=/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524 <<< 30582 1726855386.35194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.35224: stderr chunk (state=3): >>><<< 30582 1726855386.35227: stdout chunk (state=3): >>><<< 30582 1726855386.35243: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855386.3256137-35974-118806611949524=/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855386.35283: variable 'ansible_module_compression' from source: unknown 30582 1726855386.35322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855386.35352: variable 'ansible_facts' from source: unknown 30582 1726855386.35409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py 30582 1726855386.35511: Sending initial data 30582 1726855386.35514: Sent initial data (153 bytes) 30582 1726855386.35954: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855386.35957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855386.35961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.35963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.35965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.36020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.36024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855386.36028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.36085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.37643: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855386.37651: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855386.37701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855386.37759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpg3ridi5s /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py <<< 30582 1726855386.37765: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py" <<< 30582 1726855386.37818: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpg3ridi5s" to remote "/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py" <<< 30582 1726855386.37821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py" <<< 30582 1726855386.38392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.38432: stderr chunk (state=3): >>><<< 30582 1726855386.38436: stdout chunk (state=3): >>><<< 30582 1726855386.38477: done transferring module to remote 30582 1726855386.38486: _low_level_execute_command(): starting 30582 1726855386.38491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/ /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py && sleep 0' 30582 1726855386.38933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855386.38937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855386.38939: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.38941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.38947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855386.38949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.38995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.39009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.39065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.40843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.40871: stderr chunk (state=3): >>><<< 30582 1726855386.40874: stdout chunk (state=3): >>><<< 30582 1726855386.40891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855386.40895: _low_level_execute_command(): starting 30582 1726855386.40900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/AnsiballZ_ping.py && sleep 0' 30582 1726855386.41357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.41360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.41362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855386.41365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.41418: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.41421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.41495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.56470: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855386.57802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855386.57831: stderr chunk (state=3): >>><<< 30582 1726855386.57834: stdout chunk (state=3): >>><<< 30582 1726855386.57849: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855386.57874: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855386.57882: _low_level_execute_command(): starting 30582 1726855386.57888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855386.3256137-35974-118806611949524/ > /dev/null 2>&1 && sleep 0' 30582 1726855386.58339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855386.58343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.58345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855386.58347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855386.58351: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855386.58401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855386.58404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855386.58468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855386.60315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855386.60340: stderr chunk (state=3): >>><<< 30582 1726855386.60343: stdout chunk (state=3): >>><<< 30582 1726855386.60356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 30582 1726855386.60365: handler run complete 30582 1726855386.60378: attempt loop complete, returning result 30582 1726855386.60381: _execute() done 30582 1726855386.60385: dumping result to json 30582 1726855386.60389: done dumping result, returning 30582 1726855386.60399: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-0000000024b9] 30582 1726855386.60402: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b9 30582 1726855386.60493: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000024b9 30582 1726855386.60496: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855386.60584: no more pending results, returning what we have 30582 1726855386.60590: results queue empty 30582 1726855386.60591: checking for any_errors_fatal 30582 1726855386.60601: done checking for any_errors_fatal 30582 1726855386.60602: checking for max_fail_percentage 30582 1726855386.60604: done checking for max_fail_percentage 30582 1726855386.60605: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.60607: done checking to see if all hosts have failed 30582 1726855386.60607: getting the remaining hosts for this loop 30582 1726855386.60609: done getting the remaining hosts for this loop 30582 1726855386.60612: getting the next task for host managed_node3 30582 1726855386.60624: done getting next task for host managed_node3 30582 1726855386.60626: ^ task is: TASK: meta (role_complete) 30582 1726855386.60631: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855386.60645: getting variables 30582 1726855386.60646: in VariableManager get_vars() 30582 1726855386.60697: Calling all_inventory to load vars for managed_node3 30582 1726855386.60700: Calling groups_inventory to load vars for managed_node3 30582 1726855386.60702: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.60712: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.60715: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.60717: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.61538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.62514: done with get_vars() 30582 1726855386.62533: done getting variables 30582 1726855386.62599: done queuing things up, now waiting for results queue to drain 30582 1726855386.62601: results queue empty 30582 1726855386.62601: checking for any_errors_fatal 30582 1726855386.62603: done checking for 
any_errors_fatal 30582 1726855386.62603: checking for max_fail_percentage 30582 1726855386.62604: done checking for max_fail_percentage 30582 1726855386.62605: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.62605: done checking to see if all hosts have failed 30582 1726855386.62606: getting the remaining hosts for this loop 30582 1726855386.62606: done getting the remaining hosts for this loop 30582 1726855386.62608: getting the next task for host managed_node3 30582 1726855386.62612: done getting next task for host managed_node3 30582 1726855386.62614: ^ task is: TASK: Test 30582 1726855386.62615: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.62617: getting variables 30582 1726855386.62617: in VariableManager get_vars() 30582 1726855386.62626: Calling all_inventory to load vars for managed_node3 30582 1726855386.62628: Calling groups_inventory to load vars for managed_node3 30582 1726855386.62629: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.62632: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.62634: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.62636: Calling groups_plugins_play to load vars for managed_node3 30582 1726855386.63270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855386.64119: done with get_vars() 30582 1726855386.64138: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 14:03:06 -0400 (0:00:00.357) 0:02:02.991 ****** 30582 1726855386.64200: entering _queue_task() for managed_node3/include_tasks 30582 1726855386.64482: worker is 1 (out of 1 available) 30582 1726855386.64497: exiting _queue_task() for managed_node3/include_tasks 30582 1726855386.64510: done queuing things up, now waiting for results queue to drain 30582 1726855386.64511: waiting for pending results... 
30582 1726855386.64699: running TaskExecutor() for managed_node3/TASK: Test 30582 1726855386.64779: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b1 30582 1726855386.64793: variable 'ansible_search_path' from source: unknown 30582 1726855386.64798: variable 'ansible_search_path' from source: unknown 30582 1726855386.64832: variable 'lsr_test' from source: include params 30582 1726855386.64999: variable 'lsr_test' from source: include params 30582 1726855386.65055: variable 'omit' from source: magic vars 30582 1726855386.65157: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.65167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.65176: variable 'omit' from source: magic vars 30582 1726855386.65350: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.65358: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.65366: variable 'item' from source: unknown 30582 1726855386.65414: variable 'item' from source: unknown 30582 1726855386.65435: variable 'item' from source: unknown 30582 1726855386.65479: variable 'item' from source: unknown 30582 1726855386.65624: dumping result to json 30582 1726855386.65627: done dumping result, returning 30582 1726855386.65629: done running TaskExecutor() for managed_node3/TASK: Test [0affcc66-ac2b-aa83-7d57-0000000020b1] 30582 1726855386.65630: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b1 30582 1726855386.65667: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b1 30582 1726855386.65670: WORKER PROCESS EXITING 30582 1726855386.65696: no more pending results, returning what we have 30582 1726855386.65700: in VariableManager get_vars() 30582 1726855386.65751: Calling all_inventory to load vars for managed_node3 30582 1726855386.65754: Calling groups_inventory to load vars for managed_node3 30582 1726855386.65756: Calling all_plugins_inventory to load 
vars for managed_node3
30582 1726855386.65771: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.65774: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.65776: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.66727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.71931: done with get_vars()
30582 1726855386.71949: variable 'ansible_search_path' from source: unknown
30582 1726855386.71950: variable 'ansible_search_path' from source: unknown
30582 1726855386.71982: we have included files to process
30582 1726855386.71983: generating all_blocks data
30582 1726855386.71984: done generating all_blocks data
30582 1726855386.71986: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml
30582 1726855386.71986: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml
30582 1726855386.71989: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml
30582 1726855386.72059: done processing included file
30582 1726855386.72060: iterating over new_blocks loaded from include file
30582 1726855386.72061: in VariableManager get_vars()
30582 1726855386.72077: done with get_vars()
30582 1726855386.72078: filtering new block on tags
30582 1726855386.72096: done filtering new block on tags
30582 1726855386.72098: done iterating over new_blocks loaded from include file
included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node3 => (item=tasks/remove+down_profile.yml)
30582 1726855386.72101: extending task lists for all hosts with included blocks
30582 1726855386.72622: done extending task lists
30582 1726855386.72623: done processing included files
30582 1726855386.72624: results queue empty
30582 1726855386.72624: checking for any_errors_fatal
30582 1726855386.72626: done checking for any_errors_fatal
30582 1726855386.72626: checking for max_fail_percentage
30582 1726855386.72627: done checking for max_fail_percentage
30582 1726855386.72627: checking to see if all hosts have failed and the running result is not ok
30582 1726855386.72628: done checking to see if all hosts have failed
30582 1726855386.72628: getting the remaining hosts for this loop
30582 1726855386.72629: done getting the remaining hosts for this loop
30582 1726855386.72631: getting the next task for host managed_node3
30582 1726855386.72633: done getting next task for host managed_node3
30582 1726855386.72634: ^ task is: TASK: Include network role
30582 1726855386.72636: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855386.72638: getting variables
30582 1726855386.72639: in VariableManager get_vars()
30582 1726855386.72646: Calling all_inventory to load vars for managed_node3
30582 1726855386.72648: Calling groups_inventory to load vars for managed_node3
30582 1726855386.72649: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.72656: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.72657: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.72659: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.73314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.74171: done with get_vars()
30582 1726855386.74190: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3
Friday 20 September 2024 14:03:06 -0400 (0:00:00.100) 0:02:03.092 ******
30582 1726855386.74247: entering _queue_task() for managed_node3/include_role
30582 1726855386.74536: worker is 1 (out of 1 available)
30582 1726855386.74551: exiting _queue_task() for managed_node3/include_role
30582 1726855386.74565: done queuing things up, now waiting for results queue to drain
30582 1726855386.74567: waiting for pending results...
30582 1726855386.74753: running TaskExecutor() for managed_node3/TASK: Include network role
30582 1726855386.74841: in run() - task 0affcc66-ac2b-aa83-7d57-000000002612
30582 1726855386.74852: variable 'ansible_search_path' from source: unknown
30582 1726855386.74856: variable 'ansible_search_path' from source: unknown
30582 1726855386.74886: calling self._execute()
30582 1726855386.74960: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855386.74967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855386.74974: variable 'omit' from source: magic vars
30582 1726855386.75269: variable 'ansible_distribution_major_version' from source: facts
30582 1726855386.75277: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855386.75283: _execute() done
30582 1726855386.75286: dumping result to json
30582 1726855386.75290: done dumping result, returning
30582 1726855386.75298: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcc66-ac2b-aa83-7d57-000000002612]
30582 1726855386.75302: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002612
30582 1726855386.75436: no more pending results, returning what we have
30582 1726855386.75441: in VariableManager get_vars()
30582 1726855386.75495: Calling all_inventory to load vars for managed_node3
30582 1726855386.75499: Calling groups_inventory to load vars for managed_node3
30582 1726855386.75502: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.75515: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.75518: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.75520: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.76406: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002612
30582 1726855386.76410: WORKER PROCESS EXITING
30582 1726855386.76420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.77291: done with get_vars()
30582 1726855386.77306: variable 'ansible_search_path' from source: unknown
30582 1726855386.77306: variable 'ansible_search_path' from source: unknown
30582 1726855386.77394: variable 'omit' from source: magic vars
30582 1726855386.77422: variable 'omit' from source: magic vars
30582 1726855386.77433: variable 'omit' from source: magic vars
30582 1726855386.77436: we have included files to process
30582 1726855386.77436: generating all_blocks data
30582 1726855386.77437: done generating all_blocks data
30582 1726855386.77438: processing included file: fedora.linux_system_roles.network
30582 1726855386.77452: in VariableManager get_vars()
30582 1726855386.77465: done with get_vars()
30582 1726855386.77486: in VariableManager get_vars()
30582 1726855386.77500: done with get_vars()
30582 1726855386.77526: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
30582 1726855386.77602: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
30582 1726855386.77651: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
30582 1726855386.77922: in VariableManager get_vars()
30582 1726855386.77936: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30582 1726855386.79179: iterating over new_blocks loaded from include file
30582 1726855386.79181: in VariableManager get_vars()
30582 1726855386.79197: done with get_vars()
30582 1726855386.79198: filtering new block on tags
30582 1726855386.79356: done filtering new block on tags
30582 1726855386.79359: in VariableManager get_vars()
30582 1726855386.79372: done with get_vars()
30582 1726855386.79373: filtering new block on tags
30582 1726855386.79384: done filtering new block on tags
30582 1726855386.79385: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed_node3
30582 1726855386.79391: extending task lists for all hosts with included blocks
30582 1726855386.79456: done extending task lists
30582 1726855386.79457: done processing included files
30582 1726855386.79457: results queue empty
30582 1726855386.79458: checking for any_errors_fatal
30582 1726855386.79462: done checking for any_errors_fatal
30582 1726855386.79462: checking for max_fail_percentage
30582 1726855386.79465: done checking for max_fail_percentage
30582 1726855386.79466: checking to see if all hosts have failed and the running result is not ok
30582 1726855386.79466: done checking to see if all hosts have failed
30582 1726855386.79466: getting the remaining hosts for this loop
30582 1726855386.79467: done getting the remaining hosts for this loop
30582 1726855386.79469: getting the next task for host managed_node3
30582 1726855386.79472: done getting next task for host managed_node3
30582 1726855386.79474: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30582 1726855386.79476: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855386.79484: getting variables
30582 1726855386.79484: in VariableManager get_vars()
30582 1726855386.79496: Calling all_inventory to load vars for managed_node3
30582 1726855386.79497: Calling groups_inventory to load vars for managed_node3
30582 1726855386.79499: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.79504: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.79505: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.79507: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.80236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.81106: done with get_vars()
30582 1726855386.81131: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 14:03:06 -0400 (0:00:00.069) 0:02:03.161 ******
30582 1726855386.81195: entering _queue_task() for managed_node3/include_tasks
30582 1726855386.81486: worker is 1 (out of 1 available)
30582 1726855386.81502: exiting _queue_task() for managed_node3/include_tasks
30582 1726855386.81516: done queuing things up, now waiting for results queue to drain
30582 1726855386.81518: waiting for pending results...
30582 1726855386.81706: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
30582 1726855386.81791: in run() - task 0affcc66-ac2b-aa83-7d57-000000002694
30582 1726855386.81805: variable 'ansible_search_path' from source: unknown
30582 1726855386.81808: variable 'ansible_search_path' from source: unknown
30582 1726855386.81837: calling self._execute()
30582 1726855386.81915: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855386.81919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855386.81928: variable 'omit' from source: magic vars
30582 1726855386.82228: variable 'ansible_distribution_major_version' from source: facts
30582 1726855386.82237: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855386.82243: _execute() done
30582 1726855386.82246: dumping result to json
30582 1726855386.82249: done dumping result, returning
30582 1726855386.82258: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-aa83-7d57-000000002694]
30582 1726855386.82265: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002694
30582 1726855386.82352: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002694
30582 1726855386.82355: WORKER PROCESS EXITING
30582 1726855386.82416: no more pending results, returning what we have
30582 1726855386.82421: in VariableManager get_vars()
30582 1726855386.82478: Calling all_inventory to load vars for managed_node3
30582 1726855386.82481: Calling groups_inventory to load vars for managed_node3
30582 1726855386.82483: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.82501: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.82503: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.82506: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.83336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.84217: done with get_vars()
30582 1726855386.84236: variable 'ansible_search_path' from source: unknown
30582 1726855386.84237: variable 'ansible_search_path' from source: unknown
30582 1726855386.84268: we have included files to process
30582 1726855386.84269: generating all_blocks data
30582 1726855386.84270: done generating all_blocks data
30582 1726855386.84273: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30582 1726855386.84273: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30582 1726855386.84275: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
30582 1726855386.84665: done processing included file
30582 1726855386.84666: iterating over new_blocks loaded from include file
30582 1726855386.84667: in VariableManager get_vars()
30582 1726855386.84686: done with get_vars()
30582 1726855386.84689: filtering new block on tags
30582 1726855386.84709: done filtering new block on tags
30582 1726855386.84711: in VariableManager get_vars()
30582 1726855386.84726: done with get_vars()
30582 1726855386.84727: filtering new block on tags
30582 1726855386.84753: done filtering new block on tags
30582 1726855386.84755: in VariableManager get_vars()
30582 1726855386.84773: done with get_vars()
30582 1726855386.84774: filtering new block on tags
30582 1726855386.84802: done filtering new block on tags
30582 1726855386.84803: done iterating over new_blocks loaded from include file
included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
30582 1726855386.84807: extending task lists for all hosts with included blocks
30582 1726855386.85804: done extending task lists
30582 1726855386.85806: done processing included files
30582 1726855386.85807: results queue empty
30582 1726855386.85807: checking for any_errors_fatal
30582 1726855386.85809: done checking for any_errors_fatal
30582 1726855386.85810: checking for max_fail_percentage
30582 1726855386.85810: done checking for max_fail_percentage
30582 1726855386.85811: checking to see if all hosts have failed and the running result is not ok
30582 1726855386.85812: done checking to see if all hosts have failed
30582 1726855386.85812: getting the remaining hosts for this loop
30582 1726855386.85813: done getting the remaining hosts for this loop
30582 1726855386.85815: getting the next task for host managed_node3
30582 1726855386.85818: done getting next task for host managed_node3
30582 1726855386.85821: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30582 1726855386.85823: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855386.85833: getting variables
30582 1726855386.85834: in VariableManager get_vars()
30582 1726855386.85848: Calling all_inventory to load vars for managed_node3
30582 1726855386.85850: Calling groups_inventory to load vars for managed_node3
30582 1726855386.85852: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.85857: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.85858: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.85860: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.86565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.87440: done with get_vars()
30582 1726855386.87464: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 14:03:06 -0400 (0:00:00.063) 0:02:03.225 ******
30582 1726855386.87526: entering _queue_task() for managed_node3/setup
30582 1726855386.87828: worker is 1 (out of 1 available)
30582 1726855386.87841: exiting _queue_task() for managed_node3/setup
30582 1726855386.87854: done queuing things up, now waiting for results queue to drain
30582 1726855386.87856: waiting for pending results...
30582 1726855386.88045: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
30582 1726855386.88140: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026eb
30582 1726855386.88154: variable 'ansible_search_path' from source: unknown
30582 1726855386.88158: variable 'ansible_search_path' from source: unknown
30582 1726855386.88196: calling self._execute()
30582 1726855386.88260: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855386.88268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855386.88272: variable 'omit' from source: magic vars
30582 1726855386.88797: variable 'ansible_distribution_major_version' from source: facts
30582 1726855386.88801: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855386.88956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30582 1726855386.90848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30582 1726855386.90905: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30582 1726855386.90933: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30582 1726855386.90958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30582 1726855386.90982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30582 1726855386.91044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855386.91065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855386.91089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855386.91116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855386.91127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855386.91165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30582 1726855386.91189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30582 1726855386.91203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855386.91228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30582 1726855386.91238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30582 1726855386.91357: variable '__network_required_facts' from source: role '' defaults
30582 1726855386.91365: variable 'ansible_facts' from source: unknown
30582 1726855386.91840: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
30582 1726855386.91844: when evaluation is False, skipping this task
30582 1726855386.91846: _execute() done
30582 1726855386.91849: dumping result to json
30582 1726855386.91852: done dumping result, returning
30582 1726855386.91860: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-aa83-7d57-0000000026eb]
30582 1726855386.91864: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026eb
30582 1726855386.91957: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026eb
30582 1726855386.91960: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30582 1726855386.92005: no more pending results, returning what we have
30582 1726855386.92009: results queue empty
30582 1726855386.92010: checking for any_errors_fatal
30582 1726855386.92011: done checking for any_errors_fatal
30582 1726855386.92012: checking for max_fail_percentage
30582 1726855386.92014: done checking for max_fail_percentage
30582 1726855386.92015: checking to see if all hosts have failed and the running result is not ok
30582 1726855386.92016: done checking to see if all hosts have failed
30582 1726855386.92016: getting the remaining hosts for this loop
30582 1726855386.92018: done getting the remaining hosts for this loop
30582 1726855386.92022: getting the next task for host managed_node3
30582 1726855386.92034: done getting next task for host managed_node3
30582 1726855386.92037: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
30582 1726855386.92044: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855386.92074: getting variables
30582 1726855386.92076: in VariableManager get_vars()
30582 1726855386.92124: Calling all_inventory to load vars for managed_node3
30582 1726855386.92127: Calling groups_inventory to load vars for managed_node3
30582 1726855386.92129: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.92139: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.92142: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.92150: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.93016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.94039: done with get_vars()
30582 1726855386.94057: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 14:03:06 -0400 (0:00:00.066) 0:02:03.291 ******
30582 1726855386.94131: entering _queue_task() for managed_node3/stat
30582 1726855386.94405: worker is 1 (out of 1 available)
30582 1726855386.94419: exiting _queue_task() for managed_node3/stat
30582 1726855386.94432: done queuing things up, now waiting for results queue to drain
30582 1726855386.94433: waiting for pending results...
30582 1726855386.94623: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
30582 1726855386.94724: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026ed
30582 1726855386.94737: variable 'ansible_search_path' from source: unknown
30582 1726855386.94740: variable 'ansible_search_path' from source: unknown
30582 1726855386.94768: calling self._execute()
30582 1726855386.94848: variable 'ansible_host' from source: host vars for 'managed_node3'
30582 1726855386.94852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
30582 1726855386.94860: variable 'omit' from source: magic vars
30582 1726855386.95152: variable 'ansible_distribution_major_version' from source: facts
30582 1726855386.95161: Evaluated conditional (ansible_distribution_major_version != '6'): True
30582 1726855386.95281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30582 1726855386.95482: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30582 1726855386.95516: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30582 1726855386.95544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30582 1726855386.95575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30582 1726855386.95643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30582 1726855386.95662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30582 1726855386.95682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30582 1726855386.95702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30582 1726855386.95775: variable '__network_is_ostree' from source: set_fact
30582 1726855386.95780: Evaluated conditional (not __network_is_ostree is defined): False
30582 1726855386.95783: when evaluation is False, skipping this task
30582 1726855386.95786: _execute() done
30582 1726855386.95790: dumping result to json
30582 1726855386.95795: done dumping result, returning
30582 1726855386.95803: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-aa83-7d57-0000000026ed]
30582 1726855386.95807: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026ed
30582 1726855386.95892: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026ed
30582 1726855386.95895: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30582 1726855386.95943: no more pending results, returning what we have
30582 1726855386.95947: results queue empty
30582 1726855386.95948: checking for any_errors_fatal
30582 1726855386.95957: done checking for any_errors_fatal
30582 1726855386.95957: checking for max_fail_percentage
30582 1726855386.95959: done checking for max_fail_percentage
30582 1726855386.95960: checking to see if all hosts have failed and the running result is not ok
30582 1726855386.95961: done checking to see if all hosts have failed
30582 1726855386.95962: getting the remaining hosts for this loop
30582 1726855386.95963: done getting the remaining hosts for this loop
30582 1726855386.95967: getting the next task for host managed_node3
30582 1726855386.95976: done getting next task for host managed_node3
30582 1726855386.95979: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
30582 1726855386.95984: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30582 1726855386.96019: getting variables
30582 1726855386.96021: in VariableManager get_vars()
30582 1726855386.96063: Calling all_inventory to load vars for managed_node3
30582 1726855386.96066: Calling groups_inventory to load vars for managed_node3
30582 1726855386.96068: Calling all_plugins_inventory to load vars for managed_node3
30582 1726855386.96078: Calling all_plugins_play to load vars for managed_node3
30582 1726855386.96081: Calling groups_plugins_inventory to load vars for managed_node3
30582 1726855386.96084: Calling groups_plugins_play to load vars for managed_node3
30582 1726855386.96891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30582 1726855386.97769: done with get_vars()
30582 1726855386.97786: done getting variables
30582 1726855386.97831: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 14:03:06 -0400 (0:00:00.037) 0:02:03.328 ******
30582 1726855386.97860: entering _queue_task() for managed_node3/set_fact
30582 1726855386.98108: worker is 1 (out of 1 available)
30582 1726855386.98120: exiting _queue_task() for managed_node3/set_fact
30582 1726855386.98133: done queuing things up, now waiting for results queue to drain
30582 1726855386.98135: waiting for pending results...
30582 1726855386.98318: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30582 1726855386.98424: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026ee 30582 1726855386.98436: variable 'ansible_search_path' from source: unknown 30582 1726855386.98439: variable 'ansible_search_path' from source: unknown 30582 1726855386.98468: calling self._execute() 30582 1726855386.98538: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855386.98543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855386.98551: variable 'omit' from source: magic vars 30582 1726855386.98829: variable 'ansible_distribution_major_version' from source: facts 30582 1726855386.98841: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855386.98955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855386.99155: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855386.99191: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855386.99218: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855386.99246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855386.99313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855386.99331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855386.99351: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855386.99374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855386.99444: variable '__network_is_ostree' from source: set_fact 30582 1726855386.99451: Evaluated conditional (not __network_is_ostree is defined): False 30582 1726855386.99454: when evaluation is False, skipping this task 30582 1726855386.99456: _execute() done 30582 1726855386.99459: dumping result to json 30582 1726855386.99461: done dumping result, returning 30582 1726855386.99474: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-aa83-7d57-0000000026ee] 30582 1726855386.99476: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026ee 30582 1726855386.99558: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026ee 30582 1726855386.99562: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30582 1726855386.99616: no more pending results, returning what we have 30582 1726855386.99620: results queue empty 30582 1726855386.99621: checking for any_errors_fatal 30582 1726855386.99628: done checking for any_errors_fatal 30582 1726855386.99629: checking for max_fail_percentage 30582 1726855386.99631: done checking for max_fail_percentage 30582 1726855386.99632: checking to see if all hosts have failed and the running result is not ok 30582 1726855386.99632: done checking to see if all hosts have failed 30582 1726855386.99633: getting the remaining hosts for this loop 30582 1726855386.99634: done getting the remaining hosts for this loop 
30582 1726855386.99638: getting the next task for host managed_node3 30582 1726855386.99649: done getting next task for host managed_node3 30582 1726855386.99653: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855386.99658: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855386.99685: getting variables 30582 1726855386.99688: in VariableManager get_vars() 30582 1726855386.99727: Calling all_inventory to load vars for managed_node3 30582 1726855386.99729: Calling groups_inventory to load vars for managed_node3 30582 1726855386.99731: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855386.99740: Calling all_plugins_play to load vars for managed_node3 30582 1726855386.99742: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855386.99745: Calling groups_plugins_play to load vars for managed_node3 30582 1726855387.00681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855387.01533: done with get_vars() 30582 1726855387.01550: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 14:03:07 -0400 (0:00:00.037) 0:02:03.366 ****** 30582 1726855387.01618: entering _queue_task() for managed_node3/service_facts 30582 1726855387.01865: worker is 1 (out of 1 available) 30582 1726855387.01880: exiting _queue_task() for managed_node3/service_facts 30582 1726855387.01894: done queuing things up, now waiting for results queue to drain 30582 1726855387.01896: waiting for pending results... 
30582 1726855387.02084: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 30582 1726855387.02193: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026f0 30582 1726855387.02207: variable 'ansible_search_path' from source: unknown 30582 1726855387.02211: variable 'ansible_search_path' from source: unknown 30582 1726855387.02240: calling self._execute() 30582 1726855387.02312: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855387.02317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855387.02326: variable 'omit' from source: magic vars 30582 1726855387.02612: variable 'ansible_distribution_major_version' from source: facts 30582 1726855387.02621: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855387.02628: variable 'omit' from source: magic vars 30582 1726855387.02682: variable 'omit' from source: magic vars 30582 1726855387.02706: variable 'omit' from source: magic vars 30582 1726855387.02737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855387.02764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855387.02783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855387.02798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855387.02808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855387.02832: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855387.02835: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855387.02837: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855387.02914: Set connection var ansible_timeout to 10 30582 1726855387.02917: Set connection var ansible_connection to ssh 30582 1726855387.02922: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855387.02927: Set connection var ansible_pipelining to False 30582 1726855387.02932: Set connection var ansible_shell_executable to /bin/sh 30582 1726855387.02934: Set connection var ansible_shell_type to sh 30582 1726855387.02950: variable 'ansible_shell_executable' from source: unknown 30582 1726855387.02953: variable 'ansible_connection' from source: unknown 30582 1726855387.02956: variable 'ansible_module_compression' from source: unknown 30582 1726855387.02958: variable 'ansible_shell_type' from source: unknown 30582 1726855387.02960: variable 'ansible_shell_executable' from source: unknown 30582 1726855387.02963: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855387.02969: variable 'ansible_pipelining' from source: unknown 30582 1726855387.02972: variable 'ansible_timeout' from source: unknown 30582 1726855387.02974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855387.03121: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855387.03131: variable 'omit' from source: magic vars 30582 1726855387.03136: starting attempt loop 30582 1726855387.03139: running the handler 30582 1726855387.03152: _low_level_execute_command(): starting 30582 1726855387.03158: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855387.03682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855387.03686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855387.03692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855387.03694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.03737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855387.03740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855387.03742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855387.03818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855387.05515: stdout chunk (state=3): >>>/root <<< 30582 1726855387.05618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855387.05648: stderr chunk (state=3): >>><<< 30582 1726855387.05651: stdout chunk (state=3): >>><<< 30582 1726855387.05672: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855387.05684: _low_level_execute_command(): starting 30582 1726855387.05691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540 `" && echo ansible-tmp-1726855387.0567186-35991-197049478901540="` echo /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540 `" ) && sleep 0' 30582 1726855387.06147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.06150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.06160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address debug1: re-parsing configuration <<< 30582 1726855387.06162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855387.06164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.06212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855387.06215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855387.06220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855387.06282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855387.08214: stdout chunk (state=3): >>>ansible-tmp-1726855387.0567186-35991-197049478901540=/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540 <<< 30582 1726855387.08312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855387.08338: stderr chunk (state=3): >>><<< 30582 1726855387.08341: stdout chunk (state=3): >>><<< 30582 1726855387.08356: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855387.0567186-35991-197049478901540=/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855387.08403: variable 'ansible_module_compression' from source: unknown 30582 1726855387.08445: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30582 1726855387.08478: variable 'ansible_facts' from source: unknown 30582 1726855387.08538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py 30582 1726855387.08642: Sending initial data 30582 1726855387.08646: Sent initial data (162 bytes) 30582 1726855387.09112: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.09115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.09117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30582 1726855387.09120: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.09122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.09175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855387.09179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855387.09181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855387.09246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855387.10821: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855387.10874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855387.10938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3rzxc5tv /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py <<< 30582 1726855387.10945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py" <<< 30582 1726855387.11001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3rzxc5tv" to remote "/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py" <<< 30582 1726855387.11004: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py" <<< 30582 1726855387.11613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855387.11658: stderr chunk (state=3): >>><<< 30582 1726855387.11661: stdout chunk (state=3): >>><<< 30582 1726855387.11714: done transferring module to remote 30582 1726855387.11724: _low_level_execute_command(): starting 30582 1726855387.11727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/ /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py && sleep 0' 30582 1726855387.12181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.12185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.12192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855387.12199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.12202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.12246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855387.12252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855387.12256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855387.12313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855387.14082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855387.14111: stderr chunk (state=3): >>><<< 30582 1726855387.14116: stdout chunk (state=3): >>><<< 30582 1726855387.14129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855387.14133: _low_level_execute_command(): starting 30582 1726855387.14137: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/AnsiballZ_service_facts.py && sleep 0' 30582 1726855387.14586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855387.14592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.14594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855387.14596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855387.14635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855387.14645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855387.14720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.66980: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 30582 1726855388.67002: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 30582 1726855388.67018: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 30582 1726855388.67052: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 30582 1726855388.67061: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30582 1726855388.68522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855388.68553: stderr chunk (state=3): >>><<< 30582 1726855388.68556: stdout chunk (state=3): >>><<< 30582 1726855388.68592: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": 
{"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": 
{"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855388.69056: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855388.69064: _low_level_execute_command(): starting 30582 1726855388.69071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855387.0567186-35991-197049478901540/ > /dev/null 2>&1 && sleep 0' 30582 1726855388.69526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855388.69530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855388.69532: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.69534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855388.69536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855388.69538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.69586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855388.69592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.69653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.71472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855388.71502: stderr chunk (state=3): >>><<< 30582 1726855388.71505: stdout chunk (state=3): >>><<< 30582 1726855388.71519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855388.71524: handler run complete 30582 1726855388.71639: variable 'ansible_facts' from source: unknown 30582 1726855388.71736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855388.72014: variable 'ansible_facts' from source: unknown 30582 1726855388.72095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855388.72207: attempt loop complete, returning result 30582 1726855388.72212: _execute() done 30582 1726855388.72215: dumping result to json 30582 1726855388.72253: done dumping result, returning 30582 1726855388.72261: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-aa83-7d57-0000000026f0] 30582 1726855388.72267: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026f0 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855388.72922: no more pending results, returning what we have 30582 1726855388.72925: results queue empty 30582 1726855388.72926: checking for any_errors_fatal 30582 1726855388.72931: done checking for any_errors_fatal 30582 1726855388.72931: checking for max_fail_percentage 30582 1726855388.72933: done checking for max_fail_percentage 30582 1726855388.72933: checking to see if all hosts have failed and the running result is not ok 30582 1726855388.72934: done checking to see if all hosts have failed 30582 1726855388.72935: getting the remaining hosts for this loop 30582 1726855388.72936: done getting the remaining hosts for this loop 30582 1726855388.72939: getting the next task for host managed_node3 30582 1726855388.72944: done getting next task for host managed_node3 30582 1726855388.72947: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855388.72950: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855388.72959: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026f0 30582 1726855388.72962: WORKER PROCESS EXITING 30582 1726855388.72971: getting variables 30582 1726855388.72972: in VariableManager get_vars() 30582 1726855388.73002: Calling all_inventory to load vars for managed_node3 30582 1726855388.73004: Calling groups_inventory to load vars for managed_node3 30582 1726855388.73005: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855388.73012: Calling all_plugins_play to load vars for managed_node3 30582 1726855388.73013: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855388.73019: Calling groups_plugins_play to load vars for managed_node3 30582 1726855388.73847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855388.74730: done with get_vars() 30582 1726855388.74745: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 14:03:08 -0400 (0:00:01.731) 0:02:05.098 ****** 30582 1726855388.74820: entering _queue_task() for managed_node3/package_facts 30582 1726855388.75057: worker is 1 (out of 1 available) 30582 1726855388.75074: exiting _queue_task() for managed_node3/package_facts 30582 1726855388.75090: done queuing things up, now waiting for results queue to drain 30582 1726855388.75092: waiting for pending results... 
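The task banner above carries a profile-style timing stamp, `(0:00:01.731) 0:02:05.098`, i.e. per-task elapsed time followed by cumulative playbook time. A small sketch of pulling both durations out of such a line (the `H:MM:SS.mmm` layout is assumed from the stamp shown above):

```python
import re

# Assumed stamp format: "(H:MM:SS.mmm) H:MM:SS.mmm" — per-task elapsed, then cumulative.
STAMP = re.compile(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)")

def durations(line):
    """Return (task_seconds, total_seconds) from a timing stamp line, or None."""
    m = STAMP.search(line)
    if not m:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    task = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return task, total

print(durations("Friday 20 September 2024 14:03:08 -0400 (0:00:01.731) 0:02:05.098"))
```

Summing such per-task values across a log is a quick way to find which tasks dominate the 2-minute wall-clock total recorded here.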
30582 1726855388.75270: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 30582 1726855388.75360: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026f1 30582 1726855388.75374: variable 'ansible_search_path' from source: unknown 30582 1726855388.75377: variable 'ansible_search_path' from source: unknown 30582 1726855388.75405: calling self._execute() 30582 1726855388.75479: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855388.75483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855388.75492: variable 'omit' from source: magic vars 30582 1726855388.75778: variable 'ansible_distribution_major_version' from source: facts 30582 1726855388.75789: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855388.75796: variable 'omit' from source: magic vars 30582 1726855388.75845: variable 'omit' from source: magic vars 30582 1726855388.75871: variable 'omit' from source: magic vars 30582 1726855388.75904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855388.75929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855388.75944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855388.75958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855388.75967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855388.75998: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855388.76001: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855388.76004: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 30582 1726855388.76076: Set connection var ansible_timeout to 10 30582 1726855388.76079: Set connection var ansible_connection to ssh 30582 1726855388.76082: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855388.76094: Set connection var ansible_pipelining to False 30582 1726855388.76097: Set connection var ansible_shell_executable to /bin/sh 30582 1726855388.76100: Set connection var ansible_shell_type to sh 30582 1726855388.76113: variable 'ansible_shell_executable' from source: unknown 30582 1726855388.76116: variable 'ansible_connection' from source: unknown 30582 1726855388.76119: variable 'ansible_module_compression' from source: unknown 30582 1726855388.76121: variable 'ansible_shell_type' from source: unknown 30582 1726855388.76123: variable 'ansible_shell_executable' from source: unknown 30582 1726855388.76125: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855388.76129: variable 'ansible_pipelining' from source: unknown 30582 1726855388.76132: variable 'ansible_timeout' from source: unknown 30582 1726855388.76136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855388.76274: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855388.76282: variable 'omit' from source: magic vars 30582 1726855388.76289: starting attempt loop 30582 1726855388.76292: running the handler 30582 1726855388.76304: _low_level_execute_command(): starting 30582 1726855388.76311: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855388.76819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30582 1726855388.76823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.76826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855388.76828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.76874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855388.76878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855388.76888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.76958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.78614: stdout chunk (state=3): >>>/root <<< 30582 1726855388.78715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855388.78741: stderr chunk (state=3): >>><<< 30582 1726855388.78744: stdout chunk (state=3): >>><<< 30582 1726855388.78763: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855388.78777: _low_level_execute_command(): starting 30582 1726855388.78783: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112 `" && echo ansible-tmp-1726855388.7876143-36002-95528525009112="` echo /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112 `" ) && sleep 0' 30582 1726855388.79206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855388.79210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.79218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is 
address <<< 30582 1726855388.79221: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855388.79223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.79262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855388.79268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.79333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.81225: stdout chunk (state=3): >>>ansible-tmp-1726855388.7876143-36002-95528525009112=/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112 <<< 30582 1726855388.81338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855388.81365: stderr chunk (state=3): >>><<< 30582 1726855388.81368: stdout chunk (state=3): >>><<< 30582 1726855388.81382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855388.7876143-36002-95528525009112=/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855388.81419: variable 'ansible_module_compression' from source: unknown 30582 1726855388.81455: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30582 1726855388.81510: variable 'ansible_facts' from source: unknown 30582 1726855388.81626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py 30582 1726855388.81723: Sending initial data 30582 1726855388.81727: Sent initial data (161 bytes) 30582 1726855388.82154: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855388.82157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855388.82159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855388.82166: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855388.82168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.82212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855388.82215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.82279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.83821: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855388.83875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855388.83939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpufop1nca /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py <<< 30582 1726855388.83942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py" <<< 30582 1726855388.84000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpufop1nca" to remote "/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py" <<< 30582 1726855388.84003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py" <<< 30582 1726855388.85085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855388.85126: stderr chunk (state=3): >>><<< 30582 1726855388.85130: stdout chunk (state=3): >>><<< 30582 1726855388.85164: done transferring module to remote 30582 1726855388.85175: _low_level_execute_command(): starting 30582 1726855388.85178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/ /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py && sleep 0' 30582 1726855388.85613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855388.85616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855388.85618: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.85621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855388.85627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.85675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855388.85678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.85743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855388.87540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855388.87567: stderr chunk (state=3): >>><<< 30582 1726855388.87570: stdout chunk (state=3): >>><<< 30582 1726855388.87582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855388.87585: _low_level_execute_command(): starting 30582 1726855388.87591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/AnsiballZ_package_facts.py && sleep 0' 30582 1726855388.88023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855388.88027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855388.88029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855388.88031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855388.88033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855388.88082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855388.88086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855388.88092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855388.88159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855389.32479: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 30582 1726855389.32512: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": 
"hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 30582 1726855389.32534: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", 
"version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": 
"libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 30582 1726855389.32579: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 30582 1726855389.32584: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 30582 1726855389.32605: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 30582 1726855389.32633: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 30582 1726855389.32645: stdout chunk 
(state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 30582 1726855389.32649: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", 
"version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 30582 1726855389.32654: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 30582 1726855389.32682: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": 
[{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 30582 1726855389.32709: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": 
"1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 30582 1726855389.32723: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30582 1726855389.34382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855389.34412: stderr chunk (state=3): >>><<< 30582 1726855389.34415: stdout chunk (state=3): >>><<< 30582 1726855389.34451: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855389.35763: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855389.35782: _low_level_execute_command(): starting 30582 1726855389.35785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855388.7876143-36002-95528525009112/ > /dev/null 2>&1 && sleep 0' 30582 1726855389.36242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855389.36245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855389.36248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855389.36250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855389.36253: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855389.36309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855389.36312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855389.36315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855389.36379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855389.38246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855389.38249: stdout chunk (state=3): >>><<< 30582 1726855389.38256: stderr chunk (state=3): >>><<< 30582 1726855389.38270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855389.38275: handler run complete 30582 1726855389.38729: variable 'ansible_facts' from source: unknown 30582 1726855389.39005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.40031: variable 'ansible_facts' from source: unknown 30582 1726855389.40341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.40714: attempt loop complete, returning result 30582 1726855389.40724: _execute() done 30582 1726855389.40727: dumping result to json 30582 1726855389.40840: done dumping result, returning 30582 1726855389.40848: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-aa83-7d57-0000000026f1] 30582 1726855389.40853: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026f1 30582 1726855389.42154: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026f1 30582 1726855389.42157: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855389.42249: no more pending results, returning what we have 30582 1726855389.42252: results queue empty 30582 1726855389.42253: checking for any_errors_fatal 30582 1726855389.42258: done checking for any_errors_fatal 30582 1726855389.42258: checking for max_fail_percentage 30582 1726855389.42260: done checking for max_fail_percentage 30582 1726855389.42260: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.42261: done checking to see if all hosts have failed 30582 1726855389.42261: getting the remaining hosts for this loop 30582 1726855389.42262: done getting the remaining hosts for this loop 30582 1726855389.42265: getting 
the next task for host managed_node3 30582 1726855389.42270: done getting next task for host managed_node3 30582 1726855389.42273: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855389.42277: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855389.42286: getting variables 30582 1726855389.42286: in VariableManager get_vars() 30582 1726855389.42319: Calling all_inventory to load vars for managed_node3 30582 1726855389.42321: Calling groups_inventory to load vars for managed_node3 30582 1726855389.42323: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.42330: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.42332: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.42333: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.43119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.44001: done with get_vars() 30582 1726855389.44029: done getting variables 30582 1726855389.44078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:03:09 -0400 (0:00:00.692) 0:02:05.791 ****** 30582 1726855389.44109: entering _queue_task() for managed_node3/debug 30582 1726855389.44391: worker is 1 (out of 1 available) 30582 1726855389.44405: exiting _queue_task() for managed_node3/debug 30582 1726855389.44417: done queuing things up, now waiting for results queue to drain 30582 1726855389.44419: waiting for pending results... 
30582 1726855389.44610: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 30582 1726855389.44699: in run() - task 0affcc66-ac2b-aa83-7d57-000000002695 30582 1726855389.44712: variable 'ansible_search_path' from source: unknown 30582 1726855389.44716: variable 'ansible_search_path' from source: unknown 30582 1726855389.44744: calling self._execute() 30582 1726855389.44818: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.44822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.44830: variable 'omit' from source: magic vars 30582 1726855389.45122: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.45132: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.45138: variable 'omit' from source: magic vars 30582 1726855389.45180: variable 'omit' from source: magic vars 30582 1726855389.45251: variable 'network_provider' from source: set_fact 30582 1726855389.45267: variable 'omit' from source: magic vars 30582 1726855389.45302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855389.45328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855389.45345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855389.45357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855389.45368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855389.45395: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855389.45400: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 
1726855389.45402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.45477: Set connection var ansible_timeout to 10 30582 1726855389.45481: Set connection var ansible_connection to ssh 30582 1726855389.45485: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855389.45492: Set connection var ansible_pipelining to False 30582 1726855389.45497: Set connection var ansible_shell_executable to /bin/sh 30582 1726855389.45499: Set connection var ansible_shell_type to sh 30582 1726855389.45520: variable 'ansible_shell_executable' from source: unknown 30582 1726855389.45523: variable 'ansible_connection' from source: unknown 30582 1726855389.45526: variable 'ansible_module_compression' from source: unknown 30582 1726855389.45528: variable 'ansible_shell_type' from source: unknown 30582 1726855389.45530: variable 'ansible_shell_executable' from source: unknown 30582 1726855389.45532: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.45534: variable 'ansible_pipelining' from source: unknown 30582 1726855389.45536: variable 'ansible_timeout' from source: unknown 30582 1726855389.45538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.45641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855389.45649: variable 'omit' from source: magic vars 30582 1726855389.45654: starting attempt loop 30582 1726855389.45657: running the handler 30582 1726855389.45695: handler run complete 30582 1726855389.45706: attempt loop complete, returning result 30582 1726855389.45708: _execute() done 30582 1726855389.45711: dumping result to json 30582 1726855389.45713: done dumping result, returning 
30582 1726855389.45721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-aa83-7d57-000000002695] 30582 1726855389.45726: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002695 30582 1726855389.45815: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002695 30582 1726855389.45817: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 30582 1726855389.45916: no more pending results, returning what we have 30582 1726855389.45919: results queue empty 30582 1726855389.45920: checking for any_errors_fatal 30582 1726855389.45932: done checking for any_errors_fatal 30582 1726855389.45933: checking for max_fail_percentage 30582 1726855389.45935: done checking for max_fail_percentage 30582 1726855389.45935: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.45936: done checking to see if all hosts have failed 30582 1726855389.45937: getting the remaining hosts for this loop 30582 1726855389.45938: done getting the remaining hosts for this loop 30582 1726855389.45942: getting the next task for host managed_node3 30582 1726855389.45950: done getting next task for host managed_node3 30582 1726855389.45954: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855389.45958: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.45973: getting variables 30582 1726855389.45975: in VariableManager get_vars() 30582 1726855389.46016: Calling all_inventory to load vars for managed_node3 30582 1726855389.46019: Calling groups_inventory to load vars for managed_node3 30582 1726855389.46021: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.46030: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.46032: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.46035: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.46841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.47716: done with get_vars() 30582 1726855389.47735: done getting variables 30582 1726855389.47780: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:03:09 -0400 (0:00:00.037) 0:02:05.828 ****** 30582 1726855389.47814: entering _queue_task() for managed_node3/fail 30582 1726855389.48074: worker is 1 (out of 1 available) 30582 1726855389.48091: exiting _queue_task() for managed_node3/fail 30582 1726855389.48103: done queuing things up, now waiting for results queue to drain 30582 1726855389.48105: waiting for pending results... 30582 1726855389.48286: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30582 1726855389.48384: in run() - task 0affcc66-ac2b-aa83-7d57-000000002696 30582 1726855389.48402: variable 'ansible_search_path' from source: unknown 30582 1726855389.48406: variable 'ansible_search_path' from source: unknown 30582 1726855389.48433: calling self._execute() 30582 1726855389.48512: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.48516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.48524: variable 'omit' from source: magic vars 30582 1726855389.48822: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.48834: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.48926: variable 'network_state' from source: role '' defaults 30582 1726855389.48935: Evaluated conditional (network_state != {}): False 30582 1726855389.48938: when evaluation is False, skipping this task 30582 1726855389.48941: _execute() done 30582 1726855389.48943: dumping result to json 30582 1726855389.48945: done dumping result, returning 30582 1726855389.48953: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-aa83-7d57-000000002696] 30582 1726855389.48957: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002696 30582 1726855389.49053: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002696 30582 1726855389.49055: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855389.49132: no more pending results, returning what we have 30582 1726855389.49136: results queue empty 30582 1726855389.49137: checking for any_errors_fatal 30582 1726855389.49143: done checking for any_errors_fatal 30582 1726855389.49143: checking for max_fail_percentage 30582 1726855389.49145: done checking for max_fail_percentage 30582 1726855389.49146: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.49147: done checking to see if all hosts have failed 30582 1726855389.49147: getting the remaining hosts for this loop 30582 1726855389.49149: done getting the remaining hosts for this loop 30582 1726855389.49152: getting the next task for host managed_node3 30582 1726855389.49161: done getting next task for host managed_node3 30582 1726855389.49164: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855389.49168: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.49195: getting variables 30582 1726855389.49197: in VariableManager get_vars() 30582 1726855389.49234: Calling all_inventory to load vars for managed_node3 30582 1726855389.49237: Calling groups_inventory to load vars for managed_node3 30582 1726855389.49238: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.49247: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.49249: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.49252: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.50175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.51036: done with get_vars() 30582 1726855389.51057: done getting variables 30582 1726855389.51102: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:03:09 -0400 (0:00:00.033) 0:02:05.861 ****** 30582 1726855389.51128: entering _queue_task() for managed_node3/fail 30582 1726855389.51395: worker is 1 (out of 1 available) 30582 1726855389.51410: exiting _queue_task() for managed_node3/fail 30582 1726855389.51422: done queuing things up, now waiting for results queue to drain 30582 1726855389.51423: waiting for pending results... 30582 1726855389.51618: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30582 1726855389.51720: in run() - task 0affcc66-ac2b-aa83-7d57-000000002697 30582 1726855389.51731: variable 'ansible_search_path' from source: unknown 30582 1726855389.51734: variable 'ansible_search_path' from source: unknown 30582 1726855389.51765: calling self._execute() 30582 1726855389.51840: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.51844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.51853: variable 'omit' from source: magic vars 30582 1726855389.52144: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.52154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.52244: variable 'network_state' from source: role '' defaults 30582 1726855389.52253: Evaluated conditional (network_state != {}): False 30582 1726855389.52256: when evaluation is False, skipping this task 30582 1726855389.52259: _execute() done 30582 1726855389.52261: dumping result to json 30582 1726855389.52264: done dumping result, returning 30582 1726855389.52274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-aa83-7d57-000000002697] 30582 1726855389.52277: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002697 30582 1726855389.52369: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002697 30582 1726855389.52371: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855389.52450: no more pending results, returning what we have 30582 1726855389.52454: results queue empty 30582 1726855389.52455: checking for any_errors_fatal 30582 1726855389.52465: done checking for any_errors_fatal 30582 1726855389.52466: checking for max_fail_percentage 30582 1726855389.52467: done checking for max_fail_percentage 30582 1726855389.52468: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.52469: done checking to see if all hosts have failed 30582 1726855389.52470: getting the remaining hosts for this loop 30582 1726855389.52471: done getting the remaining hosts for this loop 30582 1726855389.52475: getting the next task for host managed_node3 30582 1726855389.52482: done getting next task for host managed_node3 30582 1726855389.52486: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855389.52493: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.52520: getting variables 30582 1726855389.52521: in VariableManager get_vars() 30582 1726855389.52559: Calling all_inventory to load vars for managed_node3 30582 1726855389.52562: Calling groups_inventory to load vars for managed_node3 30582 1726855389.52564: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.52573: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.52575: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.52577: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.53378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.54254: done with get_vars() 30582 1726855389.54272: done getting variables 30582 1726855389.54315: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:03:09 -0400 (0:00:00.032) 0:02:05.893 ****** 30582 1726855389.54341: entering _queue_task() for managed_node3/fail 30582 1726855389.54593: worker is 1 (out of 1 available) 30582 1726855389.54609: exiting _queue_task() for managed_node3/fail 30582 1726855389.54622: done queuing things up, now waiting for results queue to drain 30582 1726855389.54624: waiting for pending results... 30582 1726855389.54811: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30582 1726855389.54899: in run() - task 0affcc66-ac2b-aa83-7d57-000000002698 30582 1726855389.54910: variable 'ansible_search_path' from source: unknown 30582 1726855389.54913: variable 'ansible_search_path' from source: unknown 30582 1726855389.54941: calling self._execute() 30582 1726855389.55019: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.55023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.55031: variable 'omit' from source: magic vars 30582 1726855389.55316: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.55326: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.55445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855389.57203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855389.57248: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855389.57290: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855389.57318: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855389.57338: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855389.57402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.57423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.57440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.57473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.57484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.57551: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.57567: Evaluated conditional (ansible_distribution_major_version | int > 9): True 30582 1726855389.57645: variable 'ansible_distribution' from source: facts 30582 1726855389.57649: variable '__network_rh_distros' from source: role '' defaults 30582 1726855389.57657: Evaluated conditional (ansible_distribution in __network_rh_distros): True 30582 1726855389.57817: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.57834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.57851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.57878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.57891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.57924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.57940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.57956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.57982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855389.57994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.58026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.58041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.58057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.58083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.58095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.58281: variable 'network_connections' from source: include params 30582 1726855389.58290: variable 'interface' from source: play vars 30582 1726855389.58336: variable 'interface' from source: play vars 30582 1726855389.58346: variable 'network_state' from source: role '' defaults 30582 1726855389.58392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855389.58508: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855389.58535: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855389.58559: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855389.58581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855389.58612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855389.58627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855389.58647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.58773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855389.58777: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 30582 1726855389.58779: when evaluation is False, skipping this task 30582 1726855389.58781: _execute() done 30582 1726855389.58783: dumping result to json 30582 1726855389.58784: done dumping result, returning 30582 1726855389.58786: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-aa83-7d57-000000002698] 30582 1726855389.58789: sending task result for task 
0affcc66-ac2b-aa83-7d57-000000002698 30582 1726855389.58853: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002698 30582 1726855389.58856: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 30582 1726855389.58921: no more pending results, returning what we have 30582 1726855389.58924: results queue empty 30582 1726855389.58925: checking for any_errors_fatal 30582 1726855389.58931: done checking for any_errors_fatal 30582 1726855389.58931: checking for max_fail_percentage 30582 1726855389.58933: done checking for max_fail_percentage 30582 1726855389.58934: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.58935: done checking to see if all hosts have failed 30582 1726855389.58935: getting the remaining hosts for this loop 30582 1726855389.58937: done getting the remaining hosts for this loop 30582 1726855389.58940: getting the next task for host managed_node3 30582 1726855389.58947: done getting next task for host managed_node3 30582 1726855389.58951: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855389.58955: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.58984: getting variables 30582 1726855389.58985: in VariableManager get_vars() 30582 1726855389.59024: Calling all_inventory to load vars for managed_node3 30582 1726855389.59027: Calling groups_inventory to load vars for managed_node3 30582 1726855389.59028: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.59037: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.59039: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.59042: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.59966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.60842: done with get_vars() 30582 1726855389.60859: done getting variables 30582 1726855389.60905: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:03:09 -0400 (0:00:00.065) 0:02:05.959 ****** 30582 1726855389.60929: entering _queue_task() for managed_node3/dnf 30582 1726855389.61181: worker is 1 (out of 1 available) 30582 1726855389.61198: exiting _queue_task() for managed_node3/dnf 30582 1726855389.61210: done queuing things up, now waiting for results queue to drain 30582 1726855389.61212: waiting for pending results... 30582 1726855389.61390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30582 1726855389.61483: in run() - task 0affcc66-ac2b-aa83-7d57-000000002699 30582 1726855389.61493: variable 'ansible_search_path' from source: unknown 30582 1726855389.61496: variable 'ansible_search_path' from source: unknown 30582 1726855389.61525: calling self._execute() 30582 1726855389.61597: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.61601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.61610: variable 'omit' from source: magic vars 30582 1726855389.61891: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.61901: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.62034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855389.63552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855389.63614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855389.63642: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855389.63669: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855389.63689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855389.63750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.63771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.63790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.63816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.63831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.63917: variable 'ansible_distribution' from source: facts 30582 1726855389.63921: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.63936: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30582 1726855389.64017: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855389.64106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.64123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.64140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.64265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.64271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.64273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.64276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.64278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.64280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.64282: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.64291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.64307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.64323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.64347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.64357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.64469: variable 'network_connections' from source: include params 30582 1726855389.64489: variable 'interface' from source: play vars 30582 1726855389.64532: variable 'interface' from source: play vars 30582 1726855389.64585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855389.64708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855389.64737: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855389.64760: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855389.64785: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855389.64819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855389.64837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855389.64858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.64878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855389.64916: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855389.65085: variable 'network_connections' from source: include params 30582 1726855389.65090: variable 'interface' from source: play vars 30582 1726855389.65136: variable 'interface' from source: play vars 30582 1726855389.65155: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855389.65159: when evaluation is False, skipping this task 30582 1726855389.65161: _execute() done 30582 1726855389.65163: dumping result to json 30582 1726855389.65170: done dumping result, returning 30582 1726855389.65178: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-000000002699] 30582 
1726855389.65182: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002699 30582 1726855389.65275: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002699 30582 1726855389.65278: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855389.65329: no more pending results, returning what we have 30582 1726855389.65332: results queue empty 30582 1726855389.65333: checking for any_errors_fatal 30582 1726855389.65340: done checking for any_errors_fatal 30582 1726855389.65341: checking for max_fail_percentage 30582 1726855389.65343: done checking for max_fail_percentage 30582 1726855389.65344: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.65345: done checking to see if all hosts have failed 30582 1726855389.65345: getting the remaining hosts for this loop 30582 1726855389.65347: done getting the remaining hosts for this loop 30582 1726855389.65351: getting the next task for host managed_node3 30582 1726855389.65360: done getting next task for host managed_node3 30582 1726855389.65363: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855389.65368: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.65403: getting variables 30582 1726855389.65405: in VariableManager get_vars() 30582 1726855389.65450: Calling all_inventory to load vars for managed_node3 30582 1726855389.65454: Calling groups_inventory to load vars for managed_node3 30582 1726855389.65456: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.65465: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.65468: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.65470: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.66316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.71974: done with get_vars() 30582 1726855389.72003: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30582 1726855389.72050: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:03:09 -0400 (0:00:00.111) 0:02:06.070 ****** 30582 1726855389.72073: entering _queue_task() for managed_node3/yum 30582 1726855389.72368: worker is 1 (out of 1 available) 30582 1726855389.72386: exiting _queue_task() for managed_node3/yum 30582 1726855389.72400: done queuing things up, now waiting for results queue to drain 30582 1726855389.72402: waiting for pending results... 30582 1726855389.72602: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30582 1726855389.72716: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269a 30582 1726855389.72728: variable 'ansible_search_path' from source: unknown 30582 1726855389.72735: variable 'ansible_search_path' from source: unknown 30582 1726855389.72762: calling self._execute() 30582 1726855389.72838: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.72842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.72853: variable 'omit' from source: magic vars 30582 1726855389.73151: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.73161: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.73286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855389.74841: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855389.74894: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855389.74923: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855389.74949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855389.74970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855389.75033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.75052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.75071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.75099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.75110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.75183: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.75198: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30582 1726855389.75201: when evaluation is False, skipping this task 30582 1726855389.75204: _execute() done 30582 1726855389.75206: dumping result to json 30582 1726855389.75208: done dumping result, returning 30582 1726855389.75216: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000269a] 30582 1726855389.75219: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269a 30582 1726855389.75318: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269a 30582 1726855389.75321: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30582 1726855389.75375: no more pending results, returning what we have 30582 1726855389.75379: results queue empty 30582 1726855389.75380: checking for any_errors_fatal 30582 1726855389.75388: done checking for any_errors_fatal 30582 1726855389.75389: checking for max_fail_percentage 30582 1726855389.75391: done checking for max_fail_percentage 30582 1726855389.75392: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.75393: done checking to see if all hosts have failed 30582 1726855389.75394: getting the remaining hosts for this loop 30582 1726855389.75395: done getting the remaining hosts for this loop 30582 1726855389.75398: getting the next task for host managed_node3 30582 1726855389.75408: done getting next task for host managed_node3 30582 1726855389.75412: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855389.75416: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.75450: getting variables 30582 1726855389.75452: in VariableManager get_vars() 30582 1726855389.75505: Calling all_inventory to load vars for managed_node3 30582 1726855389.75508: Calling groups_inventory to load vars for managed_node3 30582 1726855389.75510: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.75521: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.75523: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.75525: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.76353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.77244: done with get_vars() 30582 1726855389.77262: done getting variables 30582 1726855389.77309: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:03:09 -0400 (0:00:00.052) 0:02:06.123 ****** 30582 1726855389.77338: entering _queue_task() for managed_node3/fail 30582 1726855389.77589: worker is 1 (out of 1 available) 30582 1726855389.77600: exiting _queue_task() for managed_node3/fail 30582 1726855389.77609: done queuing things up, now waiting for results queue to drain 30582 1726855389.77610: waiting for pending results... 30582 1726855389.77862: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30582 1726855389.77973: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269b 30582 1726855389.77985: variable 'ansible_search_path' from source: unknown 30582 1726855389.77990: variable 'ansible_search_path' from source: unknown 30582 1726855389.78018: calling self._execute() 30582 1726855389.78164: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.78168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.78172: variable 'omit' from source: magic vars 30582 1726855389.78409: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.78419: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.78509: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855389.78639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855389.80155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855389.80210: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855389.80238: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855389.80265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855389.80290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855389.80346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.80368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.80390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.80417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.80428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.80462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.80484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.80502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.80527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.80539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.80569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.80585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.80603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.80627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.80638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.80759: variable 'network_connections' from source: include params 30582 1726855389.80768: variable 'interface' from source: play vars 30582 1726855389.80815: variable 'interface' from source: play vars 30582 1726855389.80865: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855389.80971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855389.81282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855389.81307: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855389.81327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855389.81359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855389.81377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855389.81397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.81414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855389.81453: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855389.81600: variable 'network_connections' from source: include params 30582 1726855389.81603: variable 'interface' from source: play vars 30582 1726855389.81646: variable 'interface' from source: play vars 30582 1726855389.81666: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855389.81670: when evaluation is False, skipping this task 30582 
1726855389.81672: _execute() done 30582 1726855389.81677: dumping result to json 30582 1726855389.81679: done dumping result, returning 30582 1726855389.81682: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000269b] 30582 1726855389.81689: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269b 30582 1726855389.81782: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269b 30582 1726855389.81785: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855389.81833: no more pending results, returning what we have 30582 1726855389.81836: results queue empty 30582 1726855389.81837: checking for any_errors_fatal 30582 1726855389.81843: done checking for any_errors_fatal 30582 1726855389.81843: checking for max_fail_percentage 30582 1726855389.81845: done checking for max_fail_percentage 30582 1726855389.81852: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.81853: done checking to see if all hosts have failed 30582 1726855389.81853: getting the remaining hosts for this loop 30582 1726855389.81855: done getting the remaining hosts for this loop 30582 1726855389.81858: getting the next task for host managed_node3 30582 1726855389.81868: done getting next task for host managed_node3 30582 1726855389.81872: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30582 1726855389.81877: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855389.81908: getting variables 30582 1726855389.81910: in VariableManager get_vars() 30582 1726855389.81951: Calling all_inventory to load vars for managed_node3 30582 1726855389.81953: Calling groups_inventory to load vars for managed_node3 30582 1726855389.81955: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.81968: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.81970: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.81973: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.82975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.83844: done with get_vars() 30582 1726855389.83865: done getting variables 30582 1726855389.83909: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:03:09 -0400 (0:00:00.065) 0:02:06.189 ****** 30582 1726855389.83935: entering _queue_task() for managed_node3/package 30582 1726855389.84204: worker is 1 (out of 1 available) 30582 1726855389.84217: exiting _queue_task() for managed_node3/package 30582 1726855389.84231: done queuing things up, now waiting for results queue to drain 30582 1726855389.84232: waiting for pending results... 30582 1726855389.84419: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 30582 1726855389.84521: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269c 30582 1726855389.84532: variable 'ansible_search_path' from source: unknown 30582 1726855389.84535: variable 'ansible_search_path' from source: unknown 30582 1726855389.84567: calling self._execute() 30582 1726855389.84638: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.84642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.84650: variable 'omit' from source: magic vars 30582 1726855389.84940: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.84950: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.85083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855389.85274: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855389.85313: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855389.85341: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855389.85393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855389.85479: variable 'network_packages' from source: role '' defaults 30582 1726855389.85551: variable '__network_provider_setup' from source: role '' defaults 30582 1726855389.85560: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855389.85607: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855389.85615: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855389.85661: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855389.85772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855389.87110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855389.87159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855389.87192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855389.87222: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855389.87243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855389.87318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.87339: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.87357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.87390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.87402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.87435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.87452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.87470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.87498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.87510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 
1726855389.87658: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855389.87732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.87750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.87768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.87794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.87805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.87870: variable 'ansible_python' from source: facts 30582 1726855389.87883: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855389.87941: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855389.87999: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855389.88081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.88099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.88116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.88140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.88150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.88188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855389.88208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855389.88224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.88247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855389.88259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855389.88358: variable 'network_connections' from source: include params 
30582 1726855389.88364: variable 'interface' from source: play vars 30582 1726855389.88438: variable 'interface' from source: play vars 30582 1726855389.88496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855389.88512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855389.88533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855389.88553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855389.88593: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855389.88774: variable 'network_connections' from source: include params 30582 1726855389.88777: variable 'interface' from source: play vars 30582 1726855389.88850: variable 'interface' from source: play vars 30582 1726855389.88876: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855389.88932: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855389.89122: variable 'network_connections' from source: include params 30582 1726855389.89125: variable 'interface' from source: play vars 30582 1726855389.89173: variable 'interface' from source: play vars 30582 1726855389.89191: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855389.89243: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855389.89438: variable 'network_connections' 
from source: include params 30582 1726855389.89442: variable 'interface' from source: play vars 30582 1726855389.89491: variable 'interface' from source: play vars 30582 1726855389.89527: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855389.89570: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855389.89578: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855389.89621: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855389.89753: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855389.90055: variable 'network_connections' from source: include params 30582 1726855389.90058: variable 'interface' from source: play vars 30582 1726855389.90103: variable 'interface' from source: play vars 30582 1726855389.90110: variable 'ansible_distribution' from source: facts 30582 1726855389.90113: variable '__network_rh_distros' from source: role '' defaults 30582 1726855389.90119: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.90132: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855389.90239: variable 'ansible_distribution' from source: facts 30582 1726855389.90242: variable '__network_rh_distros' from source: role '' defaults 30582 1726855389.90246: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.90259: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855389.90366: variable 'ansible_distribution' from source: facts 30582 1726855389.90373: variable '__network_rh_distros' from source: role '' defaults 30582 1726855389.90378: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.90406: variable 'network_provider' from source: set_fact 30582 
1726855389.90417: variable 'ansible_facts' from source: unknown 30582 1726855389.90885: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30582 1726855389.90890: when evaluation is False, skipping this task 30582 1726855389.90893: _execute() done 30582 1726855389.90895: dumping result to json 30582 1726855389.90897: done dumping result, returning 30582 1726855389.90905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-aa83-7d57-00000000269c] 30582 1726855389.90908: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269c 30582 1726855389.91001: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269c 30582 1726855389.91003: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30582 1726855389.91056: no more pending results, returning what we have 30582 1726855389.91060: results queue empty 30582 1726855389.91061: checking for any_errors_fatal 30582 1726855389.91069: done checking for any_errors_fatal 30582 1726855389.91070: checking for max_fail_percentage 30582 1726855389.91072: done checking for max_fail_percentage 30582 1726855389.91073: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.91073: done checking to see if all hosts have failed 30582 1726855389.91074: getting the remaining hosts for this loop 30582 1726855389.91075: done getting the remaining hosts for this loop 30582 1726855389.91079: getting the next task for host managed_node3 30582 1726855389.91090: done getting next task for host managed_node3 30582 1726855389.91095: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855389.91099: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855389.91131: getting variables 30582 1726855389.91132: in VariableManager get_vars() 30582 1726855389.91182: Calling all_inventory to load vars for managed_node3 30582 1726855389.91185: Calling groups_inventory to load vars for managed_node3 30582 1726855389.91191: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.91202: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.91205: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.91207: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.92168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.93628: done with get_vars() 30582 1726855389.93651: done getting variables 30582 1726855389.93699: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:03:09 -0400 (0:00:00.097) 0:02:06.287 ****** 30582 1726855389.93726: entering _queue_task() for managed_node3/package 30582 1726855389.93992: worker is 1 (out of 1 available) 30582 1726855389.94006: exiting _queue_task() for managed_node3/package 30582 1726855389.94018: done queuing things up, now waiting for results queue to drain 30582 1726855389.94020: waiting for pending results... 
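The "Install packages" skip recorded above hinges on Ansible's Jinja2 `subset` test: the task only runs when at least one entry in `network_packages` is missing from the gathered package facts. A minimal plain-Python sketch of that set logic (a hypothetical helper, not the role's actual code):

```python
def needs_install(network_packages, package_facts):
    """Mirror of `not network_packages is subset(ansible_facts.packages.keys())`:
    True only when some requested package is absent from the package facts."""
    return not set(network_packages).issubset(package_facts.keys())

# ansible_facts.packages maps package name -> list of installed versions.
facts = {"NetworkManager": [{"version": "1.48.0"}]}

print(needs_install(["NetworkManager"], facts))                     # False -> skipped, as in the log
print(needs_install(["NetworkManager", "wpa_supplicant"], facts))   # True  -> would install
```

Because NetworkManager was already present in the facts, the conditional evaluated False and the task was skipped with `skip_reason: "Conditional result was False"`.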
30582 1726855389.94212: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30582 1726855389.94317: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269d 30582 1726855389.94328: variable 'ansible_search_path' from source: unknown 30582 1726855389.94332: variable 'ansible_search_path' from source: unknown 30582 1726855389.94363: calling self._execute() 30582 1726855389.94450: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.94454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.94469: variable 'omit' from source: magic vars 30582 1726855389.94758: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.94768: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.94861: variable 'network_state' from source: role '' defaults 30582 1726855389.94871: Evaluated conditional (network_state != {}): False 30582 1726855389.94874: when evaluation is False, skipping this task 30582 1726855389.94877: _execute() done 30582 1726855389.94879: dumping result to json 30582 1726855389.94882: done dumping result, returning 30582 1726855389.94894: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-00000000269d] 30582 1726855389.94899: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269d 30582 1726855389.94995: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269d 30582 1726855389.94999: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855389.95046: no more pending results, returning what we have 30582 1726855389.95049: results queue empty 30582 1726855389.95052: checking 
for any_errors_fatal 30582 1726855389.95058: done checking for any_errors_fatal 30582 1726855389.95059: checking for max_fail_percentage 30582 1726855389.95061: done checking for max_fail_percentage 30582 1726855389.95062: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.95064: done checking to see if all hosts have failed 30582 1726855389.95065: getting the remaining hosts for this loop 30582 1726855389.95067: done getting the remaining hosts for this loop 30582 1726855389.95070: getting the next task for host managed_node3 30582 1726855389.95079: done getting next task for host managed_node3 30582 1726855389.95083: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855389.95091: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855389.95125: getting variables 30582 1726855389.95127: in VariableManager get_vars() 30582 1726855389.95175: Calling all_inventory to load vars for managed_node3 30582 1726855389.95178: Calling groups_inventory to load vars for managed_node3 30582 1726855389.95180: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.95196: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.95199: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.95202: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.96172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855389.97039: done with get_vars() 30582 1726855389.97056: done getting variables 30582 1726855389.97104: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:03:09 -0400 (0:00:00.034) 0:02:06.321 ****** 30582 1726855389.97130: entering _queue_task() for managed_node3/package 30582 1726855389.97386: worker is 1 (out of 1 available) 30582 1726855389.97402: exiting _queue_task() for managed_node3/package 30582 1726855389.97414: done queuing things up, now waiting for results queue to drain 30582 1726855389.97416: waiting for pending results... 
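Both "when using network_state variable" tasks in this stretch of the log gate on the same conditional, `network_state != {}`: the nmstate-related packages are only installed when the role was given a non-empty `network_state`. Restated as a trivial Python predicate (an illustrative sketch, not the role's code):

```python
def state_tasks_enabled(network_state):
    """nmstate install tasks run only when network_state is non-empty,
    i.e. the Jinja2 conditional `network_state != {}` is True."""
    return network_state != {}

print(state_tasks_enabled({}))                  # False -> both tasks skipped, as logged
print(state_tasks_enabled({"interfaces": []}))  # True  -> NetworkManager/nmstate would be installed
```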
30582 1726855389.97599: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30582 1726855389.97701: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269e 30582 1726855389.97712: variable 'ansible_search_path' from source: unknown 30582 1726855389.97717: variable 'ansible_search_path' from source: unknown 30582 1726855389.97745: calling self._execute() 30582 1726855389.97819: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855389.97823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855389.97833: variable 'omit' from source: magic vars 30582 1726855389.98117: variable 'ansible_distribution_major_version' from source: facts 30582 1726855389.98126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855389.98217: variable 'network_state' from source: role '' defaults 30582 1726855389.98225: Evaluated conditional (network_state != {}): False 30582 1726855389.98228: when evaluation is False, skipping this task 30582 1726855389.98231: _execute() done 30582 1726855389.98233: dumping result to json 30582 1726855389.98236: done dumping result, returning 30582 1726855389.98244: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-aa83-7d57-00000000269e] 30582 1726855389.98249: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269e 30582 1726855389.98343: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269e 30582 1726855389.98346: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855389.98395: no more pending results, returning what we have 30582 1726855389.98399: results queue empty 30582 1726855389.98400: checking for 
any_errors_fatal 30582 1726855389.98408: done checking for any_errors_fatal 30582 1726855389.98409: checking for max_fail_percentage 30582 1726855389.98411: done checking for max_fail_percentage 30582 1726855389.98412: checking to see if all hosts have failed and the running result is not ok 30582 1726855389.98412: done checking to see if all hosts have failed 30582 1726855389.98413: getting the remaining hosts for this loop 30582 1726855389.98414: done getting the remaining hosts for this loop 30582 1726855389.98418: getting the next task for host managed_node3 30582 1726855389.98426: done getting next task for host managed_node3 30582 1726855389.98430: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855389.98435: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855389.98461: getting variables 30582 1726855389.98466: in VariableManager get_vars() 30582 1726855389.98506: Calling all_inventory to load vars for managed_node3 30582 1726855389.98509: Calling groups_inventory to load vars for managed_node3 30582 1726855389.98511: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855389.98519: Calling all_plugins_play to load vars for managed_node3 30582 1726855389.98522: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855389.98524: Calling groups_plugins_play to load vars for managed_node3 30582 1726855389.99302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.00198: done with get_vars() 30582 1726855390.00215: done getting variables 30582 1726855390.00258: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:03:10 -0400 (0:00:00.031) 0:02:06.352 ****** 30582 1726855390.00285: entering _queue_task() for managed_node3/service 30582 1726855390.00531: worker is 1 (out of 1 available) 30582 1726855390.00546: exiting _queue_task() for managed_node3/service 30582 1726855390.00559: done queuing things up, now waiting for results queue to drain 30582 1726855390.00561: waiting for pending results... 
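The "Restart NetworkManager" task queued above is skipped unless the play defines wireless or team connection profiles; the role derives `__network_wireless_connections_defined` and `__network_team_connections_defined` from the `type` of each entry in `network_connections`. A rough Python approximation of that OR-condition (an assumption about how the flags are computed, not the role's actual Jinja2):

```python
def restart_needed(network_connections):
    """Approximates `__network_wireless_connections_defined or
    __network_team_connections_defined`: True when any connection
    profile is of type 'wireless' or 'team'."""
    return any(c.get("type") in ("wireless", "team") for c in network_connections)

print(restart_needed([{"name": "eth0", "type": "ethernet"}]))  # False -> skipped, as in the log
print(restart_needed([{"name": "wl0", "type": "wireless"}]))   # True  -> NetworkManager restarted
```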
30582 1726855390.00745: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30582 1726855390.00842: in run() - task 0affcc66-ac2b-aa83-7d57-00000000269f 30582 1726855390.00854: variable 'ansible_search_path' from source: unknown 30582 1726855390.00857: variable 'ansible_search_path' from source: unknown 30582 1726855390.00886: calling self._execute() 30582 1726855390.00964: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.00969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.00975: variable 'omit' from source: magic vars 30582 1726855390.01255: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.01267: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.01357: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855390.01489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855390.03201: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855390.03254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855390.03281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855390.03310: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855390.03330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855390.03391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30582 1726855390.03413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.03430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.03456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.03466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.03510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.03523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.03539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.03567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.03576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.03607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.03626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.03642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.03668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.03677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.03793: variable 'network_connections' from source: include params 30582 1726855390.03803: variable 'interface' from source: play vars 30582 1726855390.03854: variable 'interface' from source: play vars 30582 1726855390.03906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855390.04012: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855390.04039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855390.04074: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855390.04097: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855390.04127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855390.04142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855390.04166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.04180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855390.04221: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855390.04371: variable 'network_connections' from source: include params 30582 1726855390.04377: variable 'interface' from source: play vars 30582 1726855390.04421: variable 'interface' from source: play vars 30582 1726855390.04439: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30582 1726855390.04442: when evaluation is False, skipping this task 30582 1726855390.04445: _execute() done 30582 1726855390.04448: dumping result to json 30582 1726855390.04450: done dumping result, returning 30582 1726855390.04457: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-aa83-7d57-00000000269f] 30582 1726855390.04462: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000269f 30582 1726855390.04551: done sending task result for task 
0affcc66-ac2b-aa83-7d57-00000000269f 30582 1726855390.04561: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30582 1726855390.04607: no more pending results, returning what we have 30582 1726855390.04611: results queue empty 30582 1726855390.04612: checking for any_errors_fatal 30582 1726855390.04619: done checking for any_errors_fatal 30582 1726855390.04620: checking for max_fail_percentage 30582 1726855390.04621: done checking for max_fail_percentage 30582 1726855390.04622: checking to see if all hosts have failed and the running result is not ok 30582 1726855390.04623: done checking to see if all hosts have failed 30582 1726855390.04624: getting the remaining hosts for this loop 30582 1726855390.04625: done getting the remaining hosts for this loop 30582 1726855390.04629: getting the next task for host managed_node3 30582 1726855390.04637: done getting next task for host managed_node3 30582 1726855390.04640: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855390.04644: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855390.04679: getting variables 30582 1726855390.04680: in VariableManager get_vars() 30582 1726855390.04731: Calling all_inventory to load vars for managed_node3 30582 1726855390.04734: Calling groups_inventory to load vars for managed_node3 30582 1726855390.04736: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855390.04746: Calling all_plugins_play to load vars for managed_node3 30582 1726855390.04748: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855390.04751: Calling groups_plugins_play to load vars for managed_node3 30582 1726855390.05703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.06577: done with get_vars() 30582 1726855390.06597: done getting variables 30582 1726855390.06641: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:03:10 -0400 (0:00:00.063) 0:02:06.416 ****** 30582 1726855390.06669: entering _queue_task() for managed_node3/service 30582 1726855390.06943: worker is 1 (out of 1 available) 30582 1726855390.06956: exiting _queue_task() for managed_node3/service 30582 1726855390.06970: done 
queuing things up, now waiting for results queue to drain 30582 1726855390.06972: waiting for pending results... 30582 1726855390.07158: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30582 1726855390.07246: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a0 30582 1726855390.07258: variable 'ansible_search_path' from source: unknown 30582 1726855390.07261: variable 'ansible_search_path' from source: unknown 30582 1726855390.07295: calling self._execute() 30582 1726855390.07371: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.07376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.07383: variable 'omit' from source: magic vars 30582 1726855390.07670: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.07680: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.07796: variable 'network_provider' from source: set_fact 30582 1726855390.07799: variable 'network_state' from source: role '' defaults 30582 1726855390.07808: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30582 1726855390.07814: variable 'omit' from source: magic vars 30582 1726855390.07865: variable 'omit' from source: magic vars 30582 1726855390.07882: variable 'network_service_name' from source: role '' defaults 30582 1726855390.07932: variable 'network_service_name' from source: role '' defaults 30582 1726855390.08007: variable '__network_provider_setup' from source: role '' defaults 30582 1726855390.08012: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855390.08056: variable '__network_service_name_default_nm' from source: role '' defaults 30582 1726855390.08063: variable '__network_packages_default_nm' from source: role '' defaults 30582 1726855390.08115: variable '__network_packages_default_nm' from source: role '' 
defaults 30582 1726855390.08264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855390.09736: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855390.09792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855390.09822: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855390.09847: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855390.09870: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855390.09931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.09951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.09971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.09999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.10009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.10043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.10058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.10078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.10104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.10114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.10272: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30582 1726855390.10350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.10364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.10383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.10409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.10419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.10484: variable 'ansible_python' from source: facts 30582 1726855390.10497: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30582 1726855390.10551: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855390.10609: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30582 1726855390.10694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.10712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.10728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.10751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.10762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.10800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.10820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.10836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.10860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.10873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.10965: variable 'network_connections' from source: include params 30582 1726855390.10975: variable 'interface' from source: play vars 30582 1726855390.11030: variable 'interface' from source: play vars 30582 1726855390.11106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855390.11237: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855390.11274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855390.11308: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855390.11340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855390.11389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855390.11410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855390.11433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.11459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855390.11500: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855390.11680: variable 'network_connections' from source: include params 30582 1726855390.11685: variable 'interface' from source: play vars 30582 1726855390.11738: variable 'interface' from source: play vars 30582 1726855390.11763: variable '__network_packages_default_wireless' from source: role '' defaults 30582 1726855390.11821: variable '__network_wireless_connections_defined' from source: role '' defaults 30582 1726855390.12010: variable 'network_connections' from source: include params 30582 1726855390.12013: variable 'interface' from source: play vars 30582 1726855390.12062: variable 'interface' from source: play vars 30582 1726855390.12080: variable '__network_packages_default_team' from source: role '' defaults 30582 1726855390.12137: variable '__network_team_connections_defined' from source: role '' defaults 30582 1726855390.12326: variable 'network_connections' from source: include params 30582 1726855390.12329: variable 'interface' from source: play vars 30582 1726855390.12380: variable 'interface' from source: play vars 30582 1726855390.12421: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30582 1726855390.12459: variable '__network_service_name_default_initscripts' from source: role '' defaults 30582 1726855390.12466: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855390.12511: variable '__network_packages_default_initscripts' from source: role '' defaults 30582 1726855390.12644: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30582 1726855390.13128: variable 'network_connections' from source: include params 30582 1726855390.13131: variable 'interface' from source: play vars 30582 1726855390.13177: variable 'interface' from source: play vars 30582 1726855390.13185: variable 'ansible_distribution' from source: facts 30582 1726855390.13190: variable '__network_rh_distros' from source: role '' defaults 30582 1726855390.13192: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.13204: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30582 1726855390.13316: variable 'ansible_distribution' from source: facts 30582 1726855390.13320: variable '__network_rh_distros' from source: role '' defaults 30582 1726855390.13325: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.13336: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30582 1726855390.13447: variable 'ansible_distribution' from source: facts 30582 1726855390.13451: variable '__network_rh_distros' from source: role '' defaults 30582 1726855390.13456: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.13484: variable 'network_provider' from source: set_fact 30582 1726855390.13502: variable 'omit' from source: magic vars 30582 1726855390.13526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855390.13547: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855390.13562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855390.13578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855390.13586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855390.13611: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855390.13615: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.13618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.13690: Set connection var ansible_timeout to 10 30582 1726855390.13693: Set connection var ansible_connection to ssh 30582 1726855390.13698: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855390.13703: Set connection var ansible_pipelining to False 30582 1726855390.13708: Set connection var ansible_shell_executable to /bin/sh 30582 1726855390.13710: Set connection var ansible_shell_type to sh 30582 1726855390.13734: variable 'ansible_shell_executable' from source: unknown 30582 1726855390.13737: variable 'ansible_connection' from source: unknown 30582 1726855390.13739: variable 'ansible_module_compression' from source: unknown 30582 1726855390.13741: variable 'ansible_shell_type' from source: unknown 30582 1726855390.13743: variable 'ansible_shell_executable' from source: unknown 30582 1726855390.13745: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.13747: variable 'ansible_pipelining' from source: unknown 30582 1726855390.13749: variable 'ansible_timeout' from source: unknown 30582 1726855390.13751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855390.13825: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855390.13838: variable 'omit' from source: magic vars 30582 1726855390.13841: starting attempt loop 30582 1726855390.13849: running the handler 30582 1726855390.13903: variable 'ansible_facts' from source: unknown 30582 1726855390.14362: _low_level_execute_command(): starting 30582 1726855390.14372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855390.14876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.14880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.14882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.14885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.14938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.14943: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.14945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.15016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.16703: stdout chunk (state=3): >>>/root <<< 30582 1726855390.16799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.16830: stderr chunk (state=3): >>><<< 30582 1726855390.16833: stdout chunk (state=3): >>><<< 30582 1726855390.16852: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.16864: _low_level_execute_command(): starting 30582 1726855390.16868: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381 `" && echo ansible-tmp-1726855390.1685233-36027-22448378761381="` echo /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381 `" ) && sleep 0' 30582 1726855390.17319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.17322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855390.17325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855390.17328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855390.17330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.17382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.17385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.17390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.17454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.19357: stdout chunk (state=3): 
>>>ansible-tmp-1726855390.1685233-36027-22448378761381=/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381 <<< 30582 1726855390.19467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.19497: stderr chunk (state=3): >>><<< 30582 1726855390.19501: stdout chunk (state=3): >>><<< 30582 1726855390.19514: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855390.1685233-36027-22448378761381=/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.19540: variable 'ansible_module_compression' from source: unknown 30582 1726855390.19584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30582 1726855390.19640: variable 'ansible_facts' 
from source: unknown 30582 1726855390.19779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py 30582 1726855390.19888: Sending initial data 30582 1726855390.19892: Sent initial data (155 bytes) 30582 1726855390.20341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.20344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855390.20352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.20354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.20356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.20405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.20409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.20476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.22026: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855390.22030: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855390.22085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855390.22144: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpk7w9z4qa /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py <<< 30582 1726855390.22148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py" <<< 30582 1726855390.22204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpk7w9z4qa" to remote "/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py" <<< 30582 1726855390.22208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py" <<< 30582 1726855390.23330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.23374: stderr chunk (state=3): >>><<< 30582 1726855390.23378: stdout chunk (state=3): >>><<< 30582 1726855390.23405: done transferring module to remote 30582 1726855390.23414: _low_level_execute_command(): 
starting 30582 1726855390.23418: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/ /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py && sleep 0' 30582 1726855390.23862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.23868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.23870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.23872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855390.23874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.23924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.23932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.23938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.23988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.25768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855390.25792: stderr chunk (state=3): >>><<< 30582 1726855390.25795: stdout chunk (state=3): >>><<< 30582 1726855390.25808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.25810: _low_level_execute_command(): starting 30582 1726855390.25816: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/AnsiballZ_systemd.py && sleep 0' 30582 1726855390.26249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.26254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.26256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855390.26258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855390.26260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.26314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.26317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.26322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.26383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.55324: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": 
"org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306086400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2355628000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 30582 1726855390.55338: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", 
"LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", 
"RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.s<<< 30582 1726855390.55347: stdout chunk (state=3): >>>ocket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": 
"running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30582 1726855390.57175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855390.57206: stderr chunk (state=3): >>><<< 30582 1726855390.57209: stdout chunk (state=3): >>><<< 30582 1726855390.57228: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10629120", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306086400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "2355628000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855390.57348: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855390.57365: _low_level_execute_command(): starting 30582 1726855390.57372: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855390.1685233-36027-22448378761381/ > /dev/null 2>&1 && sleep 0' 30582 1726855390.57817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.57820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855390.57822: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.57824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.57826: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.57879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.57884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.57948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.59760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.59793: stderr chunk (state=3): >>><<< 30582 1726855390.59796: stdout chunk (state=3): >>><<< 30582 1726855390.59810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.59816: handler run complete 30582 1726855390.59855: attempt loop complete, returning result 30582 1726855390.59858: _execute() done 30582 1726855390.59860: dumping result to json 30582 1726855390.59877: done dumping result, returning 30582 1726855390.59885: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-aa83-7d57-0000000026a0] 30582 1726855390.59891: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a0 30582 1726855390.60138: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a0 30582 1726855390.60141: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855390.60212: no more pending results, returning what we have 30582 1726855390.60215: results queue empty 30582 1726855390.60216: checking for any_errors_fatal 30582 1726855390.60221: done checking for any_errors_fatal 30582 1726855390.60222: checking for max_fail_percentage 30582 1726855390.60223: done checking for max_fail_percentage 30582 1726855390.60224: checking to see if all hosts have failed and the running result is not ok 30582 1726855390.60225: done checking to see if all hosts have failed 30582 1726855390.60226: getting the remaining hosts for this loop 30582 1726855390.60227: done getting the remaining hosts for this loop 30582 1726855390.60231: getting the next task for host managed_node3 30582 1726855390.60238: done getting next task for host managed_node3 30582 1726855390.60242: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855390.60247: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855390.60264: getting variables 30582 1726855390.60266: in VariableManager get_vars() 30582 1726855390.60306: Calling all_inventory to load vars for managed_node3 30582 1726855390.60309: Calling groups_inventory to load vars for managed_node3 30582 1726855390.60311: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855390.60322: Calling all_plugins_play to load vars for managed_node3 30582 1726855390.60325: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855390.60327: Calling groups_plugins_play to load vars for managed_node3 30582 1726855390.61266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.62148: done with get_vars() 30582 1726855390.62170: done getting variables 30582 1726855390.62217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:03:10 -0400 (0:00:00.555) 0:02:06.972 ****** 30582 1726855390.62249: entering _queue_task() for managed_node3/service 30582 1726855390.62528: worker is 1 (out of 1 available) 30582 1726855390.62544: exiting _queue_task() for managed_node3/service 30582 1726855390.62557: done queuing things up, now waiting for results queue to drain 30582 1726855390.62559: waiting for pending results... 30582 1726855390.62751: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30582 1726855390.62845: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a1 30582 1726855390.62856: variable 'ansible_search_path' from source: unknown 30582 1726855390.62859: variable 'ansible_search_path' from source: unknown 30582 1726855390.62892: calling self._execute() 30582 1726855390.62994: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.62997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.63001: variable 'omit' from source: magic vars 30582 1726855390.63273: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.63283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.63367: variable 'network_provider' from source: set_fact 30582 1726855390.63370: Evaluated conditional (network_provider == "nm"): True 30582 1726855390.63434: variable '__network_wpa_supplicant_required' from source: role '' defaults 30582 1726855390.63498: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30582 1726855390.63618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855390.65089: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855390.65139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855390.65169: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855390.65195: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855390.65217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855390.65289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.65311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.65329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.65354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.65367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.65407: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.65421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.65438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.65466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.65475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.65504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.65524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.65540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.65567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 
1726855390.65575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.65685: variable 'network_connections' from source: include params 30582 1726855390.65697: variable 'interface' from source: play vars 30582 1726855390.65752: variable 'interface' from source: play vars 30582 1726855390.65805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30582 1726855390.65917: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30582 1726855390.65945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30582 1726855390.65969: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30582 1726855390.65990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30582 1726855390.66021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30582 1726855390.66035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30582 1726855390.66052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.66073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30582 1726855390.66113: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30582 1726855390.66267: variable 'network_connections' from source: include params 30582 1726855390.66271: variable 'interface' from source: play vars 30582 1726855390.66315: variable 'interface' from source: play vars 30582 1726855390.66338: Evaluated conditional (__network_wpa_supplicant_required): False 30582 1726855390.66341: when evaluation is False, skipping this task 30582 1726855390.66343: _execute() done 30582 1726855390.66346: dumping result to json 30582 1726855390.66348: done dumping result, returning 30582 1726855390.66356: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-aa83-7d57-0000000026a1] 30582 1726855390.66370: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a1 30582 1726855390.66456: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a1 30582 1726855390.66459: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30582 1726855390.66539: no more pending results, returning what we have 30582 1726855390.66544: results queue empty 30582 1726855390.66545: checking for any_errors_fatal 30582 1726855390.66575: done checking for any_errors_fatal 30582 1726855390.66575: checking for max_fail_percentage 30582 1726855390.66577: done checking for max_fail_percentage 30582 1726855390.66579: checking to see if all hosts have failed and the running result is not ok 30582 1726855390.66579: done checking to see if all hosts have failed 30582 1726855390.66580: getting the remaining hosts for this loop 30582 1726855390.66582: done getting the remaining hosts for this loop 30582 1726855390.66585: getting the next task for host managed_node3 30582 1726855390.66595: done getting next task for host managed_node3 30582 1726855390.66600: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 30582 1726855390.66605: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855390.66631: getting variables 30582 1726855390.66632: in VariableManager get_vars() 30582 1726855390.66675: Calling all_inventory to load vars for managed_node3 30582 1726855390.66678: Calling groups_inventory to load vars for managed_node3 30582 1726855390.66680: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855390.66693: Calling all_plugins_play to load vars for managed_node3 30582 1726855390.66696: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855390.66699: Calling groups_plugins_play to load vars for managed_node3 30582 1726855390.67499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.68383: done with get_vars() 30582 1726855390.68402: done getting variables 30582 1726855390.68445: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:03:10 -0400 (0:00:00.062) 0:02:07.034 ****** 30582 1726855390.68471: entering _queue_task() for managed_node3/service 30582 1726855390.68728: worker is 1 (out of 1 available) 30582 1726855390.68740: exiting _queue_task() for managed_node3/service 30582 1726855390.68752: done queuing things up, now waiting for results queue to drain 30582 1726855390.68754: waiting for pending results... 
30582 1726855390.68949: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 30582 1726855390.69047: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a2 30582 1726855390.69058: variable 'ansible_search_path' from source: unknown 30582 1726855390.69061: variable 'ansible_search_path' from source: unknown 30582 1726855390.69098: calling self._execute() 30582 1726855390.69169: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.69173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.69180: variable 'omit' from source: magic vars 30582 1726855390.69469: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.69478: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.69561: variable 'network_provider' from source: set_fact 30582 1726855390.69567: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855390.69570: when evaluation is False, skipping this task 30582 1726855390.69572: _execute() done 30582 1726855390.69575: dumping result to json 30582 1726855390.69577: done dumping result, returning 30582 1726855390.69585: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-aa83-7d57-0000000026a2] 30582 1726855390.69592: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a2 30582 1726855390.69683: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a2 30582 1726855390.69686: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30582 1726855390.69730: no more pending results, returning what we have 30582 1726855390.69734: results queue empty 30582 1726855390.69735: checking for any_errors_fatal 30582 1726855390.69743: done checking for 
any_errors_fatal 30582 1726855390.69743: checking for max_fail_percentage 30582 1726855390.69745: done checking for max_fail_percentage 30582 1726855390.69746: checking to see if all hosts have failed and the running result is not ok 30582 1726855390.69747: done checking to see if all hosts have failed 30582 1726855390.69748: getting the remaining hosts for this loop 30582 1726855390.69749: done getting the remaining hosts for this loop 30582 1726855390.69753: getting the next task for host managed_node3 30582 1726855390.69762: done getting next task for host managed_node3 30582 1726855390.69768: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855390.69773: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855390.69804: getting variables 30582 1726855390.69806: in VariableManager get_vars() 30582 1726855390.69847: Calling all_inventory to load vars for managed_node3 30582 1726855390.69850: Calling groups_inventory to load vars for managed_node3 30582 1726855390.69853: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855390.69865: Calling all_plugins_play to load vars for managed_node3 30582 1726855390.69868: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855390.69870: Calling groups_plugins_play to load vars for managed_node3 30582 1726855390.70808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.71678: done with get_vars() 30582 1726855390.71697: done getting variables 30582 1726855390.71742: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:03:10 -0400 (0:00:00.032) 0:02:07.067 ****** 30582 1726855390.71770: entering _queue_task() for managed_node3/copy 30582 1726855390.72026: worker is 1 (out of 1 available) 30582 1726855390.72041: exiting _queue_task() for managed_node3/copy 30582 1726855390.72053: done queuing things up, now waiting for results queue to drain 30582 1726855390.72055: waiting for pending results... 
30582 1726855390.72243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30582 1726855390.72352: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a3 30582 1726855390.72365: variable 'ansible_search_path' from source: unknown 30582 1726855390.72368: variable 'ansible_search_path' from source: unknown 30582 1726855390.72398: calling self._execute() 30582 1726855390.72469: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.72472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.72479: variable 'omit' from source: magic vars 30582 1726855390.72764: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.72772: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.72856: variable 'network_provider' from source: set_fact 30582 1726855390.72861: Evaluated conditional (network_provider == "initscripts"): False 30582 1726855390.72867: when evaluation is False, skipping this task 30582 1726855390.72870: _execute() done 30582 1726855390.72873: dumping result to json 30582 1726855390.72875: done dumping result, returning 30582 1726855390.72882: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-aa83-7d57-0000000026a3] 30582 1726855390.72888: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a3 30582 1726855390.72983: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a3 30582 1726855390.72986: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30582 1726855390.73034: no more pending results, returning what we have 30582 1726855390.73038: results queue empty 30582 1726855390.73039: checking for 
any_errors_fatal 30582 1726855390.73045: done checking for any_errors_fatal 30582 1726855390.73046: checking for max_fail_percentage 30582 1726855390.73048: done checking for max_fail_percentage 30582 1726855390.73049: checking to see if all hosts have failed and the running result is not ok 30582 1726855390.73049: done checking to see if all hosts have failed 30582 1726855390.73051: getting the remaining hosts for this loop 30582 1726855390.73052: done getting the remaining hosts for this loop 30582 1726855390.73056: getting the next task for host managed_node3 30582 1726855390.73066: done getting next task for host managed_node3 30582 1726855390.73070: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855390.73075: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855390.73105: getting variables 30582 1726855390.73107: in VariableManager get_vars() 30582 1726855390.73149: Calling all_inventory to load vars for managed_node3 30582 1726855390.73152: Calling groups_inventory to load vars for managed_node3 30582 1726855390.73154: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855390.73165: Calling all_plugins_play to load vars for managed_node3 30582 1726855390.73168: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855390.73170: Calling groups_plugins_play to load vars for managed_node3 30582 1726855390.73976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855390.74986: done with get_vars() 30582 1726855390.75005: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:03:10 -0400 (0:00:00.032) 0:02:07.100 ****** 30582 1726855390.75069: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855390.75320: worker is 1 (out of 1 available) 30582 1726855390.75334: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 30582 1726855390.75346: done queuing things up, now waiting for results queue to drain 30582 1726855390.75348: waiting for pending results... 
30582 1726855390.75535: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30582 1726855390.75632: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a4 30582 1726855390.75645: variable 'ansible_search_path' from source: unknown 30582 1726855390.75649: variable 'ansible_search_path' from source: unknown 30582 1726855390.75679: calling self._execute() 30582 1726855390.75751: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.75754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.75765: variable 'omit' from source: magic vars 30582 1726855390.76049: variable 'ansible_distribution_major_version' from source: facts 30582 1726855390.76058: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855390.76066: variable 'omit' from source: magic vars 30582 1726855390.76104: variable 'omit' from source: magic vars 30582 1726855390.76213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855390.77648: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855390.77692: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855390.77723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855390.77748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855390.77793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855390.77836: variable 'network_provider' from source: set_fact 30582 1726855390.77929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855390.77948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855390.77970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30582 1726855390.77998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855390.78009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855390.78062: variable 'omit' from source: magic vars 30582 1726855390.78136: variable 'omit' from source: magic vars 30582 1726855390.78206: variable 'network_connections' from source: include params 30582 1726855390.78216: variable 'interface' from source: play vars 30582 1726855390.78260: variable 'interface' from source: play vars 30582 1726855390.78370: variable 'omit' from source: magic vars 30582 1726855390.78373: variable '__lsr_ansible_managed' from source: task vars 30582 1726855390.78419: variable '__lsr_ansible_managed' from source: task vars 30582 1726855390.78546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30582 1726855390.78681: Loaded config def from plugin (lookup/template) 30582 1726855390.78685: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30582 1726855390.78707: File lookup term: get_ansible_managed.j2 30582 1726855390.78710: variable 
'ansible_search_path' from source: unknown 30582 1726855390.78713: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30582 1726855390.78725: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30582 1726855390.78740: variable 'ansible_search_path' from source: unknown 30582 1726855390.82166: variable 'ansible_managed' from source: unknown 30582 1726855390.82244: variable 'omit' from source: magic vars 30582 1726855390.82268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855390.82288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855390.82303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855390.82319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30582 1726855390.82327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855390.82349: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855390.82352: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.82355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.82491: Set connection var ansible_timeout to 10 30582 1726855390.82494: Set connection var ansible_connection to ssh 30582 1726855390.82495: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855390.82496: Set connection var ansible_pipelining to False 30582 1726855390.82497: Set connection var ansible_shell_executable to /bin/sh 30582 1726855390.82499: Set connection var ansible_shell_type to sh 30582 1726855390.82500: variable 'ansible_shell_executable' from source: unknown 30582 1726855390.82501: variable 'ansible_connection' from source: unknown 30582 1726855390.82502: variable 'ansible_module_compression' from source: unknown 30582 1726855390.82503: variable 'ansible_shell_type' from source: unknown 30582 1726855390.82504: variable 'ansible_shell_executable' from source: unknown 30582 1726855390.82506: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855390.82507: variable 'ansible_pipelining' from source: unknown 30582 1726855390.82508: variable 'ansible_timeout' from source: unknown 30582 1726855390.82509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855390.82577: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855390.82592: variable 'omit' from 
source: magic vars 30582 1726855390.82595: starting attempt loop 30582 1726855390.82597: running the handler 30582 1726855390.82608: _low_level_execute_command(): starting 30582 1726855390.82614: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855390.83114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.83118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.83120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.83122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.83169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.83172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.83183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.83258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.84932: stdout chunk (state=3): >>>/root <<< 30582 1726855390.85031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 
1726855390.85062: stderr chunk (state=3): >>><<< 30582 1726855390.85065: stdout chunk (state=3): >>><<< 30582 1726855390.85089: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.85099: _low_level_execute_command(): starting 30582 1726855390.85106: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811 `" && echo ansible-tmp-1726855390.8508897-36038-194049096583811="` echo /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811 `" ) && sleep 0' 30582 1726855390.85551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.85554: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855390.85557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.85559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.85561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.85614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.85618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.85620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.85681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.87586: stdout chunk (state=3): >>>ansible-tmp-1726855390.8508897-36038-194049096583811=/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811 <<< 30582 1726855390.87691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.87721: stderr chunk (state=3): >>><<< 30582 1726855390.87725: stdout chunk (state=3): >>><<< 30582 1726855390.87740: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855390.8508897-36038-194049096583811=/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.87779: variable 'ansible_module_compression' from source: unknown 30582 1726855390.87818: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30582 1726855390.87845: variable 'ansible_facts' from source: unknown 30582 1726855390.87912: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py 30582 1726855390.88011: Sending initial data 30582 1726855390.88014: Sent initial data (168 bytes) 30582 1726855390.88458: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.88461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855390.88470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.88472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.88474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.88512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.88526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.88586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.90131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855390.90136: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855390.90186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855390.90245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpn1pgt76i /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py <<< 30582 1726855390.90251: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py" <<< 30582 1726855390.90305: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpn1pgt76i" to remote "/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py" <<< 30582 1726855390.90309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py" <<< 30582 1726855390.91077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.91118: stderr chunk (state=3): >>><<< 30582 1726855390.91121: stdout chunk (state=3): >>><<< 30582 1726855390.91151: done transferring module to remote 30582 1726855390.91160: _low_level_execute_command(): starting 30582 1726855390.91164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/ /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py && sleep 0' 
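The two `_low_level_execute_command()` calls above follow a fixed idiom: create a per-task remote tmp dir under `umask 77`, then `chmod u+x` it before dropping the AnsiballZ payload in. A minimal local sketch of that idiom (paths here are illustrative, not the log's real ones):

```python
import os
import stat
import subprocess
import tempfile

# Reproduce the "( umask 77 && mkdir -p ... && mkdir ... ) && sleep 0" pattern
# from the log, but rooted in a throwaway local directory.
base = tempfile.mkdtemp()
cmd = (
    f'( umask 77 && mkdir -p "{base}/.ansible/tmp" '
    f'&& mkdir "{base}/.ansible/tmp/ansible-tmp-demo" ) && sleep 0'
)
subprocess.run(["/bin/sh", "-c", cmd], check=True)

# umask 77 means the new directories are private to the owner: mode 0700.
mode = stat.S_IMODE(os.stat(f"{base}/.ansible/tmp/ansible-tmp-demo").st_mode)
print(oct(mode))  # 0o700
```

The `umask 77` is why the tmp dir needs no separate `chmod 700`; the later `chmod u+x` in the log only ensures the transferred `AnsiballZ_network_connections.py` is executable by the owner.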
30582 1726855390.91599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.91602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.91605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.91607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855390.91609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.91659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.91666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.91669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.91725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855390.93479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855390.93591: stderr chunk (state=3): >>><<< 30582 1726855390.93594: stdout chunk (state=3): >>><<< 30582 1726855390.93597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855390.93600: _low_level_execute_command(): starting 30582 1726855390.93602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/AnsiballZ_network_connections.py && sleep 0' 30582 1726855390.94134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855390.94152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855390.94173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855390.94285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855390.94293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855390.94326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855390.94342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855390.94376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855390.94475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.21011: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30582 1726855391.22803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855391.22833: stderr chunk (state=3): >>><<< 30582 1726855391.22837: stdout chunk (state=3): >>><<< 30582 1726855391.22853: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855391.22886: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855391.22895: _low_level_execute_command(): starting 30582 1726855391.22900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855390.8508897-36038-194049096583811/ > /dev/null 2>&1 && sleep 0' 30582 1726855391.23353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.23357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855391.23359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.23361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.23366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.23418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.23421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855391.23424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.23485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.25320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.25347: stderr chunk (state=3): >>><<< 30582 1726855391.25350: stdout chunk (state=3): >>><<< 30582 1726855391.25367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855391.25372: handler run complete 30582 1726855391.25394: attempt loop complete, returning result 30582 1726855391.25398: _execute() done 30582 1726855391.25400: dumping result to json 30582 1726855391.25403: done dumping result, returning 30582 1726855391.25413: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-aa83-7d57-0000000026a4] 30582 1726855391.25420: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a4 30582 1726855391.25521: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a4 30582 1726855391.25524: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30582 1726855391.25644: no more pending results, returning what we have 30582 1726855391.25647: results queue empty 30582 1726855391.25648: checking for any_errors_fatal 30582 1726855391.25655: done checking for any_errors_fatal 30582 1726855391.25656: checking for max_fail_percentage 30582 1726855391.25657: done checking for max_fail_percentage 30582 1726855391.25658: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.25659: done checking 
to see if all hosts have failed 30582 1726855391.25659: getting the remaining hosts for this loop 30582 1726855391.25661: done getting the remaining hosts for this loop 30582 1726855391.25666: getting the next task for host managed_node3 30582 1726855391.25673: done getting next task for host managed_node3 30582 1726855391.25676: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855391.25681: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.25697: getting variables 30582 1726855391.25698: in VariableManager get_vars() 30582 1726855391.25740: Calling all_inventory to load vars for managed_node3 30582 1726855391.25743: Calling groups_inventory to load vars for managed_node3 30582 1726855391.25745: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.25754: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.25757: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.25759: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.26608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.27489: done with get_vars() 30582 1726855391.27508: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:03:11 -0400 (0:00:00.525) 0:02:07.625 ****** 30582 1726855391.27573: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855391.27840: worker is 1 (out of 1 available) 30582 1726855391.27854: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 30582 1726855391.27868: done queuing things up, now waiting for results queue to drain 30582 1726855391.27870: waiting for pending results... 
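The `ok: [managed_node3]` status above comes straight from the JSON the module printed on stdout: rc was 0 and `"changed": false`, so the task is reported as "ok" rather than "changed". A small sketch of that decision, using the fields from the log's result (abridged):

```python
import json

# The module's stdout as captured by _low_level_execute_command(),
# reduced to the fields used here.
stdout = json.dumps({
    "changed": False,
    "warnings": [],
    "stderr": "[002] #0, state:down persistent_state:absent, "
              "'statebr': no connection matches 'statebr' to delete\n",
})

result = json.loads(stdout)
# rc=0 plus changed=False -> "ok"; changed=True would have been "changed".
status = "changed" if result["changed"] else "ok"
print(status)  # ok
```

Note the module's own stderr (`no connection matches 'statebr' to delete`) is carried inside the JSON result and echoed under `STDERR:` in the output; it is informational, not a failure.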
30582 1726855391.28059: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 30582 1726855391.28148: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a5 30582 1726855391.28161: variable 'ansible_search_path' from source: unknown 30582 1726855391.28167: variable 'ansible_search_path' from source: unknown 30582 1726855391.28195: calling self._execute() 30582 1726855391.28273: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.28277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.28285: variable 'omit' from source: magic vars 30582 1726855391.28578: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.28590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.28674: variable 'network_state' from source: role '' defaults 30582 1726855391.28683: Evaluated conditional (network_state != {}): False 30582 1726855391.28686: when evaluation is False, skipping this task 30582 1726855391.28690: _execute() done 30582 1726855391.28693: dumping result to json 30582 1726855391.28695: done dumping result, returning 30582 1726855391.28703: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-aa83-7d57-0000000026a5] 30582 1726855391.28708: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a5 30582 1726855391.28796: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a5 30582 1726855391.28799: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30582 1726855391.28848: no more pending results, returning what we have 30582 1726855391.28852: results queue empty 30582 1726855391.28853: checking for any_errors_fatal 30582 1726855391.28868: done checking for any_errors_fatal 
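The skip above is the product of two conditionals the executor evaluates in order: the distribution check passes, but `network_state != {}` is False because `network_state` still holds its role default of `{}`. A minimal model (the version value is an assumption; the log only shows it is not `'6'`):

```python
# Model of the two conditionals evaluated for "Configure networking state":
#   (ansible_distribution_major_version != '6') -> True
#   (network_state != {})                       -> False, so the task is skipped
ansible_distribution_major_version = "9"  # assumption: any value other than '6'
network_state = {}                        # role '' defaults, as the log shows

runs = (ansible_distribution_major_version != "6") and (network_state != {})
print(runs)  # False -> skipping: [managed_node3]
```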
30582 1726855391.28869: checking for max_fail_percentage 30582 1726855391.28871: done checking for max_fail_percentage 30582 1726855391.28872: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.28873: done checking to see if all hosts have failed 30582 1726855391.28873: getting the remaining hosts for this loop 30582 1726855391.28875: done getting the remaining hosts for this loop 30582 1726855391.28878: getting the next task for host managed_node3 30582 1726855391.28886: done getting next task for host managed_node3 30582 1726855391.28895: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855391.28902: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.28933: getting variables 30582 1726855391.28935: in VariableManager get_vars() 30582 1726855391.28977: Calling all_inventory to load vars for managed_node3 30582 1726855391.28980: Calling groups_inventory to load vars for managed_node3 30582 1726855391.28983: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.28994: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.29000: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.29004: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.29973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.30835: done with get_vars() 30582 1726855391.30852: done getting variables 30582 1726855391.30899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:03:11 -0400 (0:00:00.033) 0:02:07.659 ****** 30582 1726855391.30924: entering _queue_task() for managed_node3/debug 30582 1726855391.31177: worker is 1 (out of 1 available) 30582 1726855391.31194: exiting _queue_task() for managed_node3/debug 30582 1726855391.31205: done queuing things up, now waiting for results queue to drain 30582 1726855391.31207: waiting for pending results... 
30582 1726855391.31390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30582 1726855391.31492: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a6 30582 1726855391.31507: variable 'ansible_search_path' from source: unknown 30582 1726855391.31510: variable 'ansible_search_path' from source: unknown 30582 1726855391.31537: calling self._execute() 30582 1726855391.31609: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.31613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.31622: variable 'omit' from source: magic vars 30582 1726855391.31909: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.31919: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.31925: variable 'omit' from source: magic vars 30582 1726855391.31967: variable 'omit' from source: magic vars 30582 1726855391.31992: variable 'omit' from source: magic vars 30582 1726855391.32026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855391.32052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855391.32067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855391.32086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.32098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.32121: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855391.32123: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.32126: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 30582 1726855391.32198: Set connection var ansible_timeout to 10 30582 1726855391.32201: Set connection var ansible_connection to ssh 30582 1726855391.32209: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855391.32211: Set connection var ansible_pipelining to False 30582 1726855391.32214: Set connection var ansible_shell_executable to /bin/sh 30582 1726855391.32217: Set connection var ansible_shell_type to sh 30582 1726855391.32235: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.32237: variable 'ansible_connection' from source: unknown 30582 1726855391.32240: variable 'ansible_module_compression' from source: unknown 30582 1726855391.32242: variable 'ansible_shell_type' from source: unknown 30582 1726855391.32244: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.32246: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.32252: variable 'ansible_pipelining' from source: unknown 30582 1726855391.32254: variable 'ansible_timeout' from source: unknown 30582 1726855391.32258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.32358: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855391.32392: variable 'omit' from source: magic vars 30582 1726855391.32395: starting attempt loop 30582 1726855391.32398: running the handler 30582 1726855391.32469: variable '__network_connections_result' from source: set_fact 30582 1726855391.32510: handler run complete 30582 1726855391.32523: attempt loop complete, returning result 30582 1726855391.32526: _execute() done 30582 1726855391.32531: dumping result to json 30582 1726855391.32533: 
done dumping result, returning 30582 1726855391.32541: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000026a6] 30582 1726855391.32546: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a6 30582 1726855391.32630: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a6 30582 1726855391.32632: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30582 1726855391.32721: no more pending results, returning what we have 30582 1726855391.32725: results queue empty 30582 1726855391.32726: checking for any_errors_fatal 30582 1726855391.32731: done checking for any_errors_fatal 30582 1726855391.32732: checking for max_fail_percentage 30582 1726855391.32733: done checking for max_fail_percentage 30582 1726855391.32734: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.32735: done checking to see if all hosts have failed 30582 1726855391.32736: getting the remaining hosts for this loop 30582 1726855391.32737: done getting the remaining hosts for this loop 30582 1726855391.32740: getting the next task for host managed_node3 30582 1726855391.32747: done getting next task for host managed_node3 30582 1726855391.32751: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855391.32755: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855391.32769: getting variables 30582 1726855391.32771: in VariableManager get_vars() 30582 1726855391.32809: Calling all_inventory to load vars for managed_node3 30582 1726855391.32812: Calling groups_inventory to load vars for managed_node3 30582 1726855391.32813: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.32822: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.32824: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.32826: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.33607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.34495: done with get_vars() 30582 1726855391.34512: done getting variables 30582 1726855391.34552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:03:11 -0400 (0:00:00.036) 0:02:07.695 ****** 30582 1726855391.34583: entering _queue_task() for managed_node3/debug 30582 1726855391.34815: worker is 1 (out of 1 available) 30582 1726855391.34829: exiting _queue_task() for managed_node3/debug 30582 1726855391.34841: done queuing things up, now waiting for results queue to drain 30582 1726855391.34842: waiting for pending results... 30582 1726855391.35021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30582 1726855391.35110: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a7 30582 1726855391.35121: variable 'ansible_search_path' from source: unknown 30582 1726855391.35124: variable 'ansible_search_path' from source: unknown 30582 1726855391.35150: calling self._execute() 30582 1726855391.35228: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.35235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.35239: variable 'omit' from source: magic vars 30582 1726855391.35521: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.35530: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.35536: variable 'omit' from source: magic vars 30582 1726855391.35578: variable 'omit' from source: magic vars 30582 1726855391.35603: variable 'omit' from source: magic vars 30582 1726855391.35637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855391.35662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855391.35679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855391.35694: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.35704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.35729: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855391.35732: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.35735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.35804: Set connection var ansible_timeout to 10 30582 1726855391.35807: Set connection var ansible_connection to ssh 30582 1726855391.35812: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855391.35817: Set connection var ansible_pipelining to False 30582 1726855391.35823: Set connection var ansible_shell_executable to /bin/sh 30582 1726855391.35826: Set connection var ansible_shell_type to sh 30582 1726855391.35843: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.35846: variable 'ansible_connection' from source: unknown 30582 1726855391.35848: variable 'ansible_module_compression' from source: unknown 30582 1726855391.35851: variable 'ansible_shell_type' from source: unknown 30582 1726855391.35854: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.35856: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.35860: variable 'ansible_pipelining' from source: unknown 30582 1726855391.35865: variable 'ansible_timeout' from source: unknown 30582 1726855391.35867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.35967: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855391.35975: variable 'omit' from source: magic vars 30582 1726855391.35981: starting attempt loop 30582 1726855391.35983: running the handler 30582 1726855391.36022: variable '__network_connections_result' from source: set_fact 30582 1726855391.36077: variable '__network_connections_result' from source: set_fact 30582 1726855391.36154: handler run complete 30582 1726855391.36172: attempt loop complete, returning result 30582 1726855391.36175: _execute() done 30582 1726855391.36178: dumping result to json 30582 1726855391.36180: done dumping result, returning 30582 1726855391.36190: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-aa83-7d57-0000000026a7] 30582 1726855391.36194: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a7 30582 1726855391.36284: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a7 30582 1726855391.36289: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30582 1726855391.36379: no more pending results, returning what we have 30582 1726855391.36383: results queue empty 30582 1726855391.36384: checking for any_errors_fatal 30582 1726855391.36390: 
done checking for any_errors_fatal 30582 1726855391.36390: checking for max_fail_percentage 30582 1726855391.36392: done checking for max_fail_percentage 30582 1726855391.36393: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.36393: done checking to see if all hosts have failed 30582 1726855391.36394: getting the remaining hosts for this loop 30582 1726855391.36395: done getting the remaining hosts for this loop 30582 1726855391.36398: getting the next task for host managed_node3 30582 1726855391.36405: done getting next task for host managed_node3 30582 1726855391.36409: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855391.36413: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.36425: getting variables 30582 1726855391.36427: in VariableManager get_vars() 30582 1726855391.36461: Calling all_inventory to load vars for managed_node3 30582 1726855391.36466: Calling groups_inventory to load vars for managed_node3 30582 1726855391.36468: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.36481: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.36483: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.36485: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.37407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.38268: done with get_vars() 30582 1726855391.38289: done getting variables 30582 1726855391.38331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:03:11 -0400 (0:00:00.037) 0:02:07.733 ****** 30582 1726855391.38355: entering _queue_task() for managed_node3/debug 30582 1726855391.38606: worker is 1 (out of 1 available) 30582 1726855391.38621: exiting _queue_task() for managed_node3/debug 30582 1726855391.38632: done queuing things up, now waiting for results queue to drain 30582 1726855391.38633: waiting for pending results... 
30582 1726855391.38820: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30582 1726855391.38920: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a8 30582 1726855391.38931: variable 'ansible_search_path' from source: unknown 30582 1726855391.38935: variable 'ansible_search_path' from source: unknown 30582 1726855391.38967: calling self._execute() 30582 1726855391.39036: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.39040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.39051: variable 'omit' from source: magic vars 30582 1726855391.39330: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.39339: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.39426: variable 'network_state' from source: role '' defaults 30582 1726855391.39436: Evaluated conditional (network_state != {}): False 30582 1726855391.39439: when evaluation is False, skipping this task 30582 1726855391.39442: _execute() done 30582 1726855391.39444: dumping result to json 30582 1726855391.39447: done dumping result, returning 30582 1726855391.39456: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-aa83-7d57-0000000026a8] 30582 1726855391.39460: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a8 30582 1726855391.39549: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a8 30582 1726855391.39551: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 30582 1726855391.39602: no more pending results, returning what we have 30582 1726855391.39606: results queue empty 30582 1726855391.39607: checking for any_errors_fatal 30582 1726855391.39618: done checking for any_errors_fatal 30582 1726855391.39619: checking for 
max_fail_percentage 30582 1726855391.39621: done checking for max_fail_percentage 30582 1726855391.39622: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.39622: done checking to see if all hosts have failed 30582 1726855391.39623: getting the remaining hosts for this loop 30582 1726855391.39624: done getting the remaining hosts for this loop 30582 1726855391.39628: getting the next task for host managed_node3 30582 1726855391.39636: done getting next task for host managed_node3 30582 1726855391.39639: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855391.39645: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.39674: getting variables 30582 1726855391.39676: in VariableManager get_vars() 30582 1726855391.39720: Calling all_inventory to load vars for managed_node3 30582 1726855391.39723: Calling groups_inventory to load vars for managed_node3 30582 1726855391.39725: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.39733: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.39736: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.39738: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.40531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.41396: done with get_vars() 30582 1726855391.41412: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:03:11 -0400 (0:00:00.031) 0:02:07.764 ****** 30582 1726855391.41480: entering _queue_task() for managed_node3/ping 30582 1726855391.41703: worker is 1 (out of 1 available) 30582 1726855391.41716: exiting _queue_task() for managed_node3/ping 30582 1726855391.41728: done queuing things up, now waiting for results queue to drain 30582 1726855391.41729: waiting for pending results... 
30582 1726855391.41906: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 30582 1726855391.41992: in run() - task 0affcc66-ac2b-aa83-7d57-0000000026a9 30582 1726855391.42005: variable 'ansible_search_path' from source: unknown 30582 1726855391.42008: variable 'ansible_search_path' from source: unknown 30582 1726855391.42033: calling self._execute() 30582 1726855391.42106: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.42110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.42118: variable 'omit' from source: magic vars 30582 1726855391.42392: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.42402: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.42405: variable 'omit' from source: magic vars 30582 1726855391.42451: variable 'omit' from source: magic vars 30582 1726855391.42475: variable 'omit' from source: magic vars 30582 1726855391.42511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855391.42535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855391.42552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855391.42567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.42575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.42601: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855391.42604: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.42607: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855391.42677: Set connection var ansible_timeout to 10 30582 1726855391.42680: Set connection var ansible_connection to ssh 30582 1726855391.42685: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855391.42691: Set connection var ansible_pipelining to False 30582 1726855391.42696: Set connection var ansible_shell_executable to /bin/sh 30582 1726855391.42699: Set connection var ansible_shell_type to sh 30582 1726855391.42717: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.42719: variable 'ansible_connection' from source: unknown 30582 1726855391.42724: variable 'ansible_module_compression' from source: unknown 30582 1726855391.42726: variable 'ansible_shell_type' from source: unknown 30582 1726855391.42728: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.42730: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.42732: variable 'ansible_pipelining' from source: unknown 30582 1726855391.42734: variable 'ansible_timeout' from source: unknown 30582 1726855391.42740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.42879: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855391.42889: variable 'omit' from source: magic vars 30582 1726855391.42894: starting attempt loop 30582 1726855391.42897: running the handler 30582 1726855391.42908: _low_level_execute_command(): starting 30582 1726855391.42915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855391.43431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 
1726855391.43435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855391.43438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855391.43441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.43493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.43498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855391.43508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.43572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.45268: stdout chunk (state=3): >>>/root <<< 30582 1726855391.45367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.45396: stderr chunk (state=3): >>><<< 30582 1726855391.45399: stdout chunk (state=3): >>><<< 30582 1726855391.45420: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855391.45436: _low_level_execute_command(): starting 30582 1726855391.45441: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081 `" && echo ansible-tmp-1726855391.4542043-36053-40433979785081="` echo /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081 `" ) && sleep 0' 30582 1726855391.45882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.45885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.45897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.45900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.45948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.45953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855391.45955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.46012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.47911: stdout chunk (state=3): >>>ansible-tmp-1726855391.4542043-36053-40433979785081=/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081 <<< 30582 1726855391.48016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.48041: stderr chunk (state=3): >>><<< 30582 1726855391.48044: stdout chunk (state=3): >>><<< 30582 1726855391.48061: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855391.4542043-36053-40433979785081=/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855391.48101: variable 'ansible_module_compression' from source: unknown 30582 1726855391.48134: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30582 1726855391.48167: variable 'ansible_facts' from source: unknown 30582 1726855391.48219: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py 30582 1726855391.48317: Sending initial data 30582 1726855391.48320: Sent initial data (152 bytes) 30582 1726855391.48754: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.48757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855391.48759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.48761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.48767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.48818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.48824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.48880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.50439: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855391.50443: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855391.50492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855391.50553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp34pdydrg /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py <<< 30582 1726855391.50556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py" <<< 30582 1726855391.50609: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp34pdydrg" to remote "/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py" <<< 30582 1726855391.51181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.51222: stderr chunk (state=3): >>><<< 30582 1726855391.51225: stdout chunk (state=3): >>><<< 30582 1726855391.51245: done transferring module to remote 30582 1726855391.51253: _low_level_execute_command(): starting 30582 1726855391.51258: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/ /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py && sleep 0' 30582 1726855391.51678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.51681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855391.51684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.51686: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855391.51695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.51741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.51747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.51803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.53553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.53580: stderr chunk (state=3): >>><<< 30582 1726855391.53583: stdout chunk (state=3): >>><<< 30582 1726855391.53598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855391.53601: _low_level_execute_command(): starting 30582 1726855391.53605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/AnsiballZ_ping.py && sleep 0' 30582 1726855391.54021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.54024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855391.54027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.54029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855391.54031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.54079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
<<< 30582 1726855391.54082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.54155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.69024: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30582 1726855391.70333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855391.70359: stderr chunk (state=3): >>><<< 30582 1726855391.70362: stdout chunk (state=3): >>><<< 30582 1726855391.70385: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
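The stdout chunk above is the remote module's entire result: `ansible.builtin.ping` simply echoes its `data` argument (default `"pong"`) back as JSON on stdout. A minimal sketch of that round-trip; this is an illustration of the behavior seen in the trace, not the actual AnsiballZ-wrapped module source:

```python
import json

def ping(module_args):
    # The real module also supports data="crash" to force an exception;
    # this sketch only models the success path visible in the trace above.
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Produces the same shape as the stdout chunk in the trace:
# {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
print(json.dumps(ping({"data": "pong"})))
```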
30582 1726855391.70409: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855391.70418: _low_level_execute_command(): starting 30582 1726855391.70423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855391.4542043-36053-40433979785081/ > /dev/null 2>&1 && sleep 0' 30582 1726855391.70873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855391.70877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855391.70879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855391.70881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.70883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.70933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.70938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855391.70944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.71000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855391.72827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855391.72851: stderr chunk (state=3): >>><<< 30582 1726855391.72854: stdout chunk (state=3): >>><<< 30582 1726855391.72873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30582 1726855391.72876: handler run complete 30582 1726855391.72890: attempt loop complete, returning result 30582 1726855391.72893: _execute() done 30582 1726855391.72896: dumping result to json 30582 1726855391.72898: done dumping result, returning 30582 1726855391.72907: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-aa83-7d57-0000000026a9] 30582 1726855391.72912: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a9 30582 1726855391.73006: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000026a9 30582 1726855391.73009: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 30582 1726855391.73091: no more pending results, returning what we have 30582 1726855391.73094: results queue empty 30582 1726855391.73095: checking for any_errors_fatal 30582 1726855391.73104: done checking for any_errors_fatal 30582 1726855391.73105: checking for max_fail_percentage 30582 1726855391.73106: done checking for max_fail_percentage 30582 1726855391.73107: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.73108: done checking to see if all hosts have failed 30582 1726855391.73109: getting the remaining hosts for this loop 30582 1726855391.73110: done getting the remaining hosts for this loop 30582 1726855391.73114: getting the next task for host managed_node3 30582 1726855391.73128: done getting next task for host managed_node3 30582 1726855391.73131: ^ task is: TASK: meta (role_complete) 30582 1726855391.73136: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855391.73150: getting variables 30582 1726855391.73151: in VariableManager get_vars() 30582 1726855391.73201: Calling all_inventory to load vars for managed_node3 30582 1726855391.73203: Calling groups_inventory to load vars for managed_node3 30582 1726855391.73205: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.73215: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.73218: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.73220: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.74166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.75035: done with get_vars() 30582 1726855391.75053: done getting variables 30582 1726855391.75116: done queuing things up, now waiting for results queue to drain 30582 1726855391.75118: results queue empty 30582 1726855391.75118: checking for any_errors_fatal 30582 1726855391.75120: done checking for 
any_errors_fatal 30582 1726855391.75120: checking for max_fail_percentage 30582 1726855391.75121: done checking for max_fail_percentage 30582 1726855391.75122: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.75122: done checking to see if all hosts have failed 30582 1726855391.75122: getting the remaining hosts for this loop 30582 1726855391.75123: done getting the remaining hosts for this loop 30582 1726855391.75125: getting the next task for host managed_node3 30582 1726855391.75128: done getting next task for host managed_node3 30582 1726855391.75130: ^ task is: TASK: Asserts 30582 1726855391.75131: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.75133: getting variables 30582 1726855391.75134: in VariableManager get_vars() 30582 1726855391.75142: Calling all_inventory to load vars for managed_node3 30582 1726855391.75143: Calling groups_inventory to load vars for managed_node3 30582 1726855391.75145: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.75148: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.75149: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.75151: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.75790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.76731: done with get_vars() 30582 1726855391.76745: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 14:03:11 -0400 (0:00:00.353) 0:02:08.117 ****** 30582 1726855391.76804: entering _queue_task() for managed_node3/include_tasks 30582 1726855391.77082: worker is 1 (out of 1 available) 30582 1726855391.77098: exiting _queue_task() for managed_node3/include_tasks 30582 1726855391.77109: done queuing things up, now waiting for results queue to drain 30582 1726855391.77111: waiting for pending results... 
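The remote working directories created earlier in this trace (e.g. `ansible-tmp-1726855391.4542043-36053-40433979785081`) follow an `ansible-tmp-<epoch>-<pid>-<random>` pattern. A sketch of how such a name could be generated; the formula below is inferred from the names in the trace, not taken from Ansible's source:

```python
import os
import random
import time

def remote_tmp_name():
    # <epoch with fraction>-<controller pid>-<large random suffix>;
    # shape matches names like ansible-tmp-1726855391.4542043-36053-40433979785081
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
```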
30582 1726855391.77301: running TaskExecutor() for managed_node3/TASK: Asserts 30582 1726855391.77380: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b2 30582 1726855391.77393: variable 'ansible_search_path' from source: unknown 30582 1726855391.77397: variable 'ansible_search_path' from source: unknown 30582 1726855391.77432: variable 'lsr_assert' from source: include params 30582 1726855391.77599: variable 'lsr_assert' from source: include params 30582 1726855391.77652: variable 'omit' from source: magic vars 30582 1726855391.77751: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.77757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.77768: variable 'omit' from source: magic vars 30582 1726855391.77935: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.77942: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.77948: variable 'item' from source: unknown 30582 1726855391.77997: variable 'item' from source: unknown 30582 1726855391.78017: variable 'item' from source: unknown 30582 1726855391.78059: variable 'item' from source: unknown 30582 1726855391.78194: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.78198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.78200: variable 'omit' from source: magic vars 30582 1726855391.78293: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.78296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.78299: variable 'item' from source: unknown 30582 1726855391.78322: variable 'item' from source: unknown 30582 1726855391.78345: variable 'item' from source: unknown 30582 1726855391.78391: variable 'item' from source: unknown 30582 1726855391.78455: dumping result to json 30582 1726855391.78458: done dumping result, returning 30582 
1726855391.78460: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcc66-ac2b-aa83-7d57-0000000020b2] 30582 1726855391.78462: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b2 30582 1726855391.78496: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b2 30582 1726855391.78499: WORKER PROCESS EXITING 30582 1726855391.78523: no more pending results, returning what we have 30582 1726855391.78528: in VariableManager get_vars() 30582 1726855391.78580: Calling all_inventory to load vars for managed_node3 30582 1726855391.78582: Calling groups_inventory to load vars for managed_node3 30582 1726855391.78585: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.78600: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.78603: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.78606: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.79431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.80279: done with get_vars() 30582 1726855391.80295: variable 'ansible_search_path' from source: unknown 30582 1726855391.80296: variable 'ansible_search_path' from source: unknown 30582 1726855391.80325: variable 'ansible_search_path' from source: unknown 30582 1726855391.80326: variable 'ansible_search_path' from source: unknown 30582 1726855391.80341: we have included files to process 30582 1726855391.80342: generating all_blocks data 30582 1726855391.80344: done generating all_blocks data 30582 1726855391.80347: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855391.80348: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855391.80350: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30582 1726855391.80424: in VariableManager get_vars() 30582 1726855391.80439: done with get_vars() 30582 1726855391.80518: done processing included file 30582 1726855391.80519: iterating over new_blocks loaded from include file 30582 1726855391.80520: in VariableManager get_vars() 30582 1726855391.80533: done with get_vars() 30582 1726855391.80534: filtering new block on tags 30582 1726855391.80555: done filtering new block on tags 30582 1726855391.80557: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 => (item=tasks/assert_profile_absent.yml) 30582 1726855391.80560: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30582 1726855391.80560: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30582 1726855391.80565: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30582 1726855391.80806: done processing included file 30582 1726855391.80808: iterating over new_blocks loaded from include file 30582 1726855391.80808: in VariableManager get_vars() 30582 1726855391.80819: done with get_vars() 30582 1726855391.80820: filtering new block on tags 30582 1726855391.80845: done filtering new block on tags 30582 1726855391.80847: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node3 => (item=tasks/get_NetworkManager_NVR.yml) 30582 1726855391.80850: extending task lists 
for all hosts with included blocks 30582 1726855391.81480: done extending task lists 30582 1726855391.81481: done processing included files 30582 1726855391.81482: results queue empty 30582 1726855391.81482: checking for any_errors_fatal 30582 1726855391.81484: done checking for any_errors_fatal 30582 1726855391.81484: checking for max_fail_percentage 30582 1726855391.81485: done checking for max_fail_percentage 30582 1726855391.81485: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.81486: done checking to see if all hosts have failed 30582 1726855391.81486: getting the remaining hosts for this loop 30582 1726855391.81489: done getting the remaining hosts for this loop 30582 1726855391.81490: getting the next task for host managed_node3 30582 1726855391.81493: done getting next task for host managed_node3 30582 1726855391.81495: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30582 1726855391.81497: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855391.81499: getting variables 30582 1726855391.81504: in VariableManager get_vars() 30582 1726855391.81511: Calling all_inventory to load vars for managed_node3 30582 1726855391.81513: Calling groups_inventory to load vars for managed_node3 30582 1726855391.81514: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.81518: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.81519: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.81521: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.86730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.87597: done with get_vars() 30582 1726855391.87618: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 14:03:11 -0400 (0:00:00.108) 0:02:08.226 ****** 30582 1726855391.87680: entering _queue_task() for managed_node3/include_tasks 30582 1726855391.87967: worker is 1 (out of 1 available) 30582 1726855391.87980: exiting _queue_task() for managed_node3/include_tasks 30582 1726855391.87994: done queuing things up, now waiting for results queue to drain 30582 1726855391.87996: waiting for pending results... 
30582 1726855391.88192: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 30582 1726855391.88298: in run() - task 0affcc66-ac2b-aa83-7d57-000000002804 30582 1726855391.88309: variable 'ansible_search_path' from source: unknown 30582 1726855391.88312: variable 'ansible_search_path' from source: unknown 30582 1726855391.88343: calling self._execute() 30582 1726855391.88420: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.88424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.88431: variable 'omit' from source: magic vars 30582 1726855391.88728: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.88737: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.88743: _execute() done 30582 1726855391.88747: dumping result to json 30582 1726855391.88750: done dumping result, returning 30582 1726855391.88757: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-aa83-7d57-000000002804] 30582 1726855391.88763: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002804 30582 1726855391.88885: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002804 30582 1726855391.88890: WORKER PROCESS EXITING 30582 1726855391.88919: no more pending results, returning what we have 30582 1726855391.88925: in VariableManager get_vars() 30582 1726855391.88978: Calling all_inventory to load vars for managed_node3 30582 1726855391.88981: Calling groups_inventory to load vars for managed_node3 30582 1726855391.88984: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.88998: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.89001: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.89004: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855391.89820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.90693: done with get_vars() 30582 1726855391.90709: variable 'ansible_search_path' from source: unknown 30582 1726855391.90710: variable 'ansible_search_path' from source: unknown 30582 1726855391.90717: variable 'item' from source: include params 30582 1726855391.90802: variable 'item' from source: include params 30582 1726855391.90829: we have included files to process 30582 1726855391.90829: generating all_blocks data 30582 1726855391.90831: done generating all_blocks data 30582 1726855391.90831: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855391.90832: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855391.90834: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30582 1726855391.91430: done processing included file 30582 1726855391.91431: iterating over new_blocks loaded from include file 30582 1726855391.91432: in VariableManager get_vars() 30582 1726855391.91445: done with get_vars() 30582 1726855391.91446: filtering new block on tags 30582 1726855391.91492: done filtering new block on tags 30582 1726855391.91495: in VariableManager get_vars() 30582 1726855391.91506: done with get_vars() 30582 1726855391.91507: filtering new block on tags 30582 1726855391.91537: done filtering new block on tags 30582 1726855391.91539: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 30582 1726855391.91543: extending task lists for all hosts with included blocks 30582 1726855391.91693: done 
extending task lists 30582 1726855391.91694: done processing included files 30582 1726855391.91695: results queue empty 30582 1726855391.91695: checking for any_errors_fatal 30582 1726855391.91698: done checking for any_errors_fatal 30582 1726855391.91699: checking for max_fail_percentage 30582 1726855391.91699: done checking for max_fail_percentage 30582 1726855391.91700: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.91701: done checking to see if all hosts have failed 30582 1726855391.91701: getting the remaining hosts for this loop 30582 1726855391.91702: done getting the remaining hosts for this loop 30582 1726855391.91704: getting the next task for host managed_node3 30582 1726855391.91707: done getting next task for host managed_node3 30582 1726855391.91708: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855391.91710: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30582 1726855391.91712: getting variables 30582 1726855391.91712: in VariableManager get_vars() 30582 1726855391.91720: Calling all_inventory to load vars for managed_node3 30582 1726855391.91721: Calling groups_inventory to load vars for managed_node3 30582 1726855391.91722: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.91726: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.91728: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.91729: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.92385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.93235: done with get_vars() 30582 1726855391.93252: done getting variables 30582 1726855391.93282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 14:03:11 -0400 (0:00:00.056) 0:02:08.282 ****** 30582 1726855391.93308: entering _queue_task() for managed_node3/set_fact 30582 1726855391.93584: worker is 1 (out of 1 available) 30582 1726855391.93598: exiting _queue_task() for managed_node3/set_fact 30582 1726855391.93611: done queuing things up, now waiting for results queue to drain 30582 1726855391.93613: waiting for pending results... 
30582 1726855391.93805: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 30582 1726855391.93889: in run() - task 0affcc66-ac2b-aa83-7d57-000000002888 30582 1726855391.93901: variable 'ansible_search_path' from source: unknown 30582 1726855391.93904: variable 'ansible_search_path' from source: unknown 30582 1726855391.93931: calling self._execute() 30582 1726855391.94004: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.94008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.94018: variable 'omit' from source: magic vars 30582 1726855391.94302: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.94311: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.94317: variable 'omit' from source: magic vars 30582 1726855391.94357: variable 'omit' from source: magic vars 30582 1726855391.94390: variable 'omit' from source: magic vars 30582 1726855391.94420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855391.94447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855391.94464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855391.94481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.94493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.94518: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855391.94522: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.94524: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 30582 1726855391.94597: Set connection var ansible_timeout to 10 30582 1726855391.94601: Set connection var ansible_connection to ssh 30582 1726855391.94606: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855391.94610: Set connection var ansible_pipelining to False 30582 1726855391.94616: Set connection var ansible_shell_executable to /bin/sh 30582 1726855391.94618: Set connection var ansible_shell_type to sh 30582 1726855391.94633: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.94636: variable 'ansible_connection' from source: unknown 30582 1726855391.94639: variable 'ansible_module_compression' from source: unknown 30582 1726855391.94641: variable 'ansible_shell_type' from source: unknown 30582 1726855391.94645: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.94647: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.94649: variable 'ansible_pipelining' from source: unknown 30582 1726855391.94652: variable 'ansible_timeout' from source: unknown 30582 1726855391.94658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.94760: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855391.94772: variable 'omit' from source: magic vars 30582 1726855391.94777: starting attempt loop 30582 1726855391.94779: running the handler 30582 1726855391.94793: handler run complete 30582 1726855391.94801: attempt loop complete, returning result 30582 1726855391.94803: _execute() done 30582 1726855391.94806: dumping result to json 30582 1726855391.94808: done dumping result, returning 30582 1726855391.94815: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-aa83-7d57-000000002888] 30582 1726855391.94822: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002888 30582 1726855391.94967: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002888 30582 1726855391.94970: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30582 1726855391.95234: no more pending results, returning what we have 30582 1726855391.95236: results queue empty 30582 1726855391.95238: checking for any_errors_fatal 30582 1726855391.95239: done checking for any_errors_fatal 30582 1726855391.95240: checking for max_fail_percentage 30582 1726855391.95242: done checking for max_fail_percentage 30582 1726855391.95243: checking to see if all hosts have failed and the running result is not ok 30582 1726855391.95243: done checking to see if all hosts have failed 30582 1726855391.95244: getting the remaining hosts for this loop 30582 1726855391.95245: done getting the remaining hosts for this loop 30582 1726855391.95248: getting the next task for host managed_node3 30582 1726855391.95255: done getting next task for host managed_node3 30582 1726855391.95257: ^ task is: TASK: Stat profile file 30582 1726855391.95262: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855391.95265: getting variables 30582 1726855391.95266: in VariableManager get_vars() 30582 1726855391.95303: Calling all_inventory to load vars for managed_node3 30582 1726855391.95306: Calling groups_inventory to load vars for managed_node3 30582 1726855391.95309: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855391.95318: Calling all_plugins_play to load vars for managed_node3 30582 1726855391.95321: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855391.95324: Calling groups_plugins_play to load vars for managed_node3 30582 1726855391.96585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855391.97583: done with get_vars() 30582 1726855391.97601: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 14:03:11 -0400 (0:00:00.043) 0:02:08.326 ****** 30582 1726855391.97668: entering _queue_task() for managed_node3/stat 30582 1726855391.97922: worker is 1 (out of 1 available) 30582 1726855391.97937: exiting _queue_task() for managed_node3/stat 30582 1726855391.97947: done queuing things up, now waiting for results queue to drain 30582 1726855391.97949: 
waiting for pending results... 30582 1726855391.98131: running TaskExecutor() for managed_node3/TASK: Stat profile file 30582 1726855391.98224: in run() - task 0affcc66-ac2b-aa83-7d57-000000002889 30582 1726855391.98236: variable 'ansible_search_path' from source: unknown 30582 1726855391.98240: variable 'ansible_search_path' from source: unknown 30582 1726855391.98267: calling self._execute() 30582 1726855391.98341: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.98345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.98354: variable 'omit' from source: magic vars 30582 1726855391.98637: variable 'ansible_distribution_major_version' from source: facts 30582 1726855391.98647: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855391.98653: variable 'omit' from source: magic vars 30582 1726855391.98690: variable 'omit' from source: magic vars 30582 1726855391.98762: variable 'profile' from source: play vars 30582 1726855391.98768: variable 'interface' from source: play vars 30582 1726855391.98818: variable 'interface' from source: play vars 30582 1726855391.98833: variable 'omit' from source: magic vars 30582 1726855391.98868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855391.98898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855391.98915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855391.98929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.98939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855391.98963: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855391.98970: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.98973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.99046: Set connection var ansible_timeout to 10 30582 1726855391.99049: Set connection var ansible_connection to ssh 30582 1726855391.99054: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855391.99060: Set connection var ansible_pipelining to False 30582 1726855391.99063: Set connection var ansible_shell_executable to /bin/sh 30582 1726855391.99069: Set connection var ansible_shell_type to sh 30582 1726855391.99086: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.99090: variable 'ansible_connection' from source: unknown 30582 1726855391.99093: variable 'ansible_module_compression' from source: unknown 30582 1726855391.99096: variable 'ansible_shell_type' from source: unknown 30582 1726855391.99098: variable 'ansible_shell_executable' from source: unknown 30582 1726855391.99101: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855391.99103: variable 'ansible_pipelining' from source: unknown 30582 1726855391.99106: variable 'ansible_timeout' from source: unknown 30582 1726855391.99108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855391.99255: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855391.99265: variable 'omit' from source: magic vars 30582 1726855391.99274: starting attempt loop 30582 1726855391.99278: running the handler 30582 1726855391.99292: _low_level_execute_command(): starting 30582 1726855391.99299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 
1726855391.99807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855391.99811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855391.99814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855391.99817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855391.99869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855391.99872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855391.99874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855391.99944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.01626: stdout chunk (state=3): >>>/root <<< 30582 1726855392.01726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.01765: stderr chunk (state=3): >>><<< 30582 1726855392.01767: stdout chunk (state=3): >>><<< 30582 1726855392.01785: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.01798: _low_level_execute_command(): starting 30582 1726855392.01804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876 `" && echo ansible-tmp-1726855392.0178432-36067-52575609043876="` echo /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876 `" ) && sleep 0' 30582 1726855392.02240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.02244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.02255: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855392.02257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855392.02259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.02306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.02310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.02375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.04282: stdout chunk (state=3): >>>ansible-tmp-1726855392.0178432-36067-52575609043876=/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876 <<< 30582 1726855392.04390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.04417: stderr chunk (state=3): >>><<< 30582 1726855392.04421: stdout chunk (state=3): >>><<< 30582 1726855392.04438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855392.0178432-36067-52575609043876=/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.04481: variable 'ansible_module_compression' from source: unknown 30582 1726855392.04531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855392.04564: variable 'ansible_facts' from source: unknown 30582 1726855392.04627: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py 30582 1726855392.04732: Sending initial data 30582 1726855392.04735: Sent initial data (152 bytes) 30582 1726855392.05185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.05190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.05193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.05195: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855392.05197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855392.05199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.05250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.05254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.05259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.05318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.06873: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855392.06927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855392.06984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpop_3yzqu /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py <<< 30582 1726855392.06990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py" <<< 30582 1726855392.07047: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpop_3yzqu" to remote "/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py" <<< 30582 1726855392.07050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py" <<< 30582 1726855392.07630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.07675: stderr chunk (state=3): >>><<< 30582 1726855392.07679: stdout chunk (state=3): >>><<< 30582 1726855392.07719: done transferring module to remote 30582 1726855392.07728: _low_level_execute_command(): starting 30582 1726855392.07733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/ /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py && sleep 0' 30582 1726855392.08191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.08194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.08197: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.08199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855392.08205: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.08207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.08255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.08261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.08264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.08318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.10070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.10097: stderr chunk (state=3): >>><<< 30582 1726855392.10101: stdout chunk (state=3): >>><<< 30582 1726855392.10116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.10119: _low_level_execute_command(): starting 30582 1726855392.10122: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/AnsiballZ_stat.py && sleep 0' 30582 1726855392.10571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.10574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.10576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.10578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.10580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.10636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.10643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.10645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.10706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.25897: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855392.27183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855392.27212: stderr chunk (state=3): >>><<< 30582 1726855392.27215: stdout chunk (state=3): >>><<< 30582 1726855392.27230: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855392.27257: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855392.27267: _low_level_execute_command(): starting 30582 1726855392.27270: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855392.0178432-36067-52575609043876/ > /dev/null 2>&1 && sleep 0' 30582 1726855392.27729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.27732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.27734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.27742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.27805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.27808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.27810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.27868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.29702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.29728: stderr chunk (state=3): >>><<< 30582 1726855392.29732: stdout chunk (state=3): >>><<< 30582 1726855392.29745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.29751: handler run complete 30582 1726855392.29769: attempt loop complete, returning result 30582 1726855392.29772: _execute() done 30582 1726855392.29776: dumping result to json 30582 1726855392.29779: done dumping result, returning 30582 1726855392.29788: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-aa83-7d57-000000002889] 30582 1726855392.29793: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002889 30582 1726855392.29893: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002889 30582 1726855392.29896: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855392.29951: no more pending results, returning what we have 30582 1726855392.29954: results queue empty 30582 1726855392.29955: checking for any_errors_fatal 30582 1726855392.29964: done checking for any_errors_fatal 30582 1726855392.29965: checking for max_fail_percentage 30582 1726855392.29967: done checking for max_fail_percentage 30582 1726855392.29968: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.29969: done checking to see if all hosts have failed 30582 1726855392.29969: getting the remaining hosts for this 
loop 30582 1726855392.29971: done getting the remaining hosts for this loop 30582 1726855392.29975: getting the next task for host managed_node3 30582 1726855392.29982: done getting next task for host managed_node3 30582 1726855392.29984: ^ task is: TASK: Set NM profile exist flag based on the profile files 30582 1726855392.29991: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.29996: getting variables 30582 1726855392.29997: in VariableManager get_vars() 30582 1726855392.30045: Calling all_inventory to load vars for managed_node3 30582 1726855392.30048: Calling groups_inventory to load vars for managed_node3 30582 1726855392.30051: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.30062: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.30065: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.30068: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.30926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.31805: done with get_vars() 30582 1726855392.31824: done getting variables 30582 1726855392.31869: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 14:03:12 -0400 (0:00:00.342) 0:02:08.668 ****** 30582 1726855392.31895: entering _queue_task() for managed_node3/set_fact 30582 1726855392.32148: worker is 1 (out of 1 available) 30582 1726855392.32161: exiting _queue_task() for managed_node3/set_fact 30582 1726855392.32173: done queuing things up, now waiting for results queue to drain 30582 1726855392.32175: waiting for pending results... 
30582 1726855392.32360: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 30582 1726855392.32441: in run() - task 0affcc66-ac2b-aa83-7d57-00000000288a 30582 1726855392.32454: variable 'ansible_search_path' from source: unknown 30582 1726855392.32458: variable 'ansible_search_path' from source: unknown 30582 1726855392.32490: calling self._execute() 30582 1726855392.32563: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.32569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.32578: variable 'omit' from source: magic vars 30582 1726855392.32861: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.32874: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.32965: variable 'profile_stat' from source: set_fact 30582 1726855392.32976: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855392.32979: when evaluation is False, skipping this task 30582 1726855392.32982: _execute() done 30582 1726855392.32984: dumping result to json 30582 1726855392.32988: done dumping result, returning 30582 1726855392.32996: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-aa83-7d57-00000000288a] 30582 1726855392.33000: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288a 30582 1726855392.33086: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288a 30582 1726855392.33091: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855392.33138: no more pending results, returning what we have 30582 1726855392.33143: results queue empty 30582 1726855392.33144: checking for any_errors_fatal 30582 1726855392.33153: done checking for any_errors_fatal 30582 1726855392.33154: 
checking for max_fail_percentage 30582 1726855392.33155: done checking for max_fail_percentage 30582 1726855392.33156: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.33157: done checking to see if all hosts have failed 30582 1726855392.33158: getting the remaining hosts for this loop 30582 1726855392.33159: done getting the remaining hosts for this loop 30582 1726855392.33163: getting the next task for host managed_node3 30582 1726855392.33171: done getting next task for host managed_node3 30582 1726855392.33174: ^ task is: TASK: Get NM profile info 30582 1726855392.33179: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.33183: getting variables 30582 1726855392.33185: in VariableManager get_vars() 30582 1726855392.33233: Calling all_inventory to load vars for managed_node3 30582 1726855392.33236: Calling groups_inventory to load vars for managed_node3 30582 1726855392.33239: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.33250: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.33252: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.33255: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.34229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.35077: done with get_vars() 30582 1726855392.35096: done getting variables 30582 1726855392.35142: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 14:03:12 -0400 (0:00:00.032) 0:02:08.701 ****** 30582 1726855392.35168: entering _queue_task() for managed_node3/shell 30582 1726855392.35426: worker is 1 (out of 1 available) 30582 1726855392.35439: exiting _queue_task() for managed_node3/shell 30582 1726855392.35452: done queuing things up, now waiting for results queue to drain 30582 1726855392.35454: waiting for pending results... 
30582 1726855392.35639: running TaskExecutor() for managed_node3/TASK: Get NM profile info 30582 1726855392.35726: in run() - task 0affcc66-ac2b-aa83-7d57-00000000288b 30582 1726855392.35740: variable 'ansible_search_path' from source: unknown 30582 1726855392.35744: variable 'ansible_search_path' from source: unknown 30582 1726855392.35774: calling self._execute() 30582 1726855392.35847: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.35851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.35861: variable 'omit' from source: magic vars 30582 1726855392.36151: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.36160: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.36169: variable 'omit' from source: magic vars 30582 1726855392.36209: variable 'omit' from source: magic vars 30582 1726855392.36285: variable 'profile' from source: play vars 30582 1726855392.36292: variable 'interface' from source: play vars 30582 1726855392.36340: variable 'interface' from source: play vars 30582 1726855392.36355: variable 'omit' from source: magic vars 30582 1726855392.36392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855392.36421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855392.36437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855392.36452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.36462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.36491: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 
1726855392.36494: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.36497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.36570: Set connection var ansible_timeout to 10 30582 1726855392.36573: Set connection var ansible_connection to ssh 30582 1726855392.36579: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855392.36584: Set connection var ansible_pipelining to False 30582 1726855392.36590: Set connection var ansible_shell_executable to /bin/sh 30582 1726855392.36594: Set connection var ansible_shell_type to sh 30582 1726855392.36609: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.36612: variable 'ansible_connection' from source: unknown 30582 1726855392.36615: variable 'ansible_module_compression' from source: unknown 30582 1726855392.36617: variable 'ansible_shell_type' from source: unknown 30582 1726855392.36619: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.36621: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.36624: variable 'ansible_pipelining' from source: unknown 30582 1726855392.36627: variable 'ansible_timeout' from source: unknown 30582 1726855392.36631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.36735: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855392.36744: variable 'omit' from source: magic vars 30582 1726855392.36749: starting attempt loop 30582 1726855392.36752: running the handler 30582 1726855392.36761: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855392.36781: _low_level_execute_command(): starting 30582 1726855392.36788: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855392.37300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.37305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855392.37307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855392.37312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.37347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.37361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.37438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.39115: stdout chunk (state=3): >>>/root <<< 30582 1726855392.39213: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 30582 1726855392.39243: stderr chunk (state=3): >>><<< 30582 1726855392.39248: stdout chunk (state=3): >>><<< 30582 1726855392.39276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.39289: _low_level_execute_command(): starting 30582 1726855392.39296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937 `" && echo ansible-tmp-1726855392.3927512-36076-24938552415937="` echo /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937 `" ) && sleep 0' 30582 1726855392.39739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.39750: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.39754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855392.39756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855392.39758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.39793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.39814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.39874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.41791: stdout chunk (state=3): >>>ansible-tmp-1726855392.3927512-36076-24938552415937=/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937 <<< 30582 1726855392.41897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.41923: stderr chunk (state=3): >>><<< 30582 1726855392.41927: stdout chunk (state=3): >>><<< 30582 1726855392.41940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855392.3927512-36076-24938552415937=/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.41971: variable 'ansible_module_compression' from source: unknown 30582 1726855392.42015: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855392.42044: variable 'ansible_facts' from source: unknown 30582 1726855392.42104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py 30582 1726855392.42203: Sending initial data 30582 1726855392.42206: Sent initial data (155 bytes) 30582 1726855392.42645: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.42649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855392.42652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.42654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.42656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855392.42658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.42708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.42711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.42775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.44350: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855392.44410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855392.44469: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprtc007wk /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py <<< 30582 1726855392.44472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py" <<< 30582 1726855392.44525: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmprtc007wk" to remote "/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py" <<< 30582 1726855392.44528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py" <<< 30582 1726855392.45118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.45161: stderr chunk (state=3): >>><<< 30582 1726855392.45164: stdout chunk (state=3): >>><<< 30582 1726855392.45208: done transferring module to remote 30582 1726855392.45217: _low_level_execute_command(): starting 30582 1726855392.45222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/ /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py && sleep 0' 30582 1726855392.45669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.45673: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.45679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.45681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.45683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.45732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.45738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.45740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.45797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.47600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.47628: stderr chunk (state=3): >>><<< 30582 1726855392.47631: stdout chunk (state=3): >>><<< 30582 1726855392.47644: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.47648: _low_level_execute_command(): starting 30582 1726855392.47653: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/AnsiballZ_command.py && sleep 0' 30582 1726855392.48093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.48097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.48112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.48164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.48167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.48239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.65106: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:03:12.633178", "end": "2024-09-20 14:03:12.650142", "delta": "0:00:00.016964", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855392.66631: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855392.66659: stderr chunk (state=3): >>><<< 30582 1726855392.66662: stdout chunk (state=3): >>><<< 30582 1726855392.66685: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 14:03:12.633178", "end": "2024-09-20 14:03:12.650142", "delta": "0:00:00.016964", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 
closed. 30582 1726855392.66720: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855392.66727: _low_level_execute_command(): starting 30582 1726855392.66732: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855392.3927512-36076-24938552415937/ > /dev/null 2>&1 && sleep 0' 30582 1726855392.67184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.67189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855392.67192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.67194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.67247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.67251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.67256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.67319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.69171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.69205: stderr chunk (state=3): >>><<< 30582 1726855392.69208: stdout chunk (state=3): >>><<< 30582 1726855392.69221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.69228: handler run complete 30582 1726855392.69248: Evaluated conditional (False): False 30582 1726855392.69256: attempt loop complete, returning result 30582 1726855392.69259: _execute() done 30582 1726855392.69261: dumping result to json 30582 1726855392.69267: done dumping result, returning 30582 1726855392.69275: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-aa83-7d57-00000000288b] 30582 1726855392.69279: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288b 30582 1726855392.69382: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288b 30582 1726855392.69385: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.016964", "end": "2024-09-20 14:03:12.650142", "rc": 1, "start": "2024-09-20 14:03:12.633178" } MSG: non-zero return code ...ignoring 30582 1726855392.69455: no more pending results, returning what we have 30582 1726855392.69459: results queue empty 30582 1726855392.69460: checking for any_errors_fatal 30582 1726855392.69471: done checking for any_errors_fatal 30582 1726855392.69472: checking for max_fail_percentage 30582 1726855392.69474: done checking for max_fail_percentage 30582 1726855392.69475: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.69475: done checking to see if all hosts have failed 30582 1726855392.69476: getting the remaining hosts for this loop 30582 1726855392.69478: done getting the remaining hosts for this loop 30582 1726855392.69482: getting the next task for host managed_node3 30582 1726855392.69492: done getting next task for host managed_node3 30582 1726855392.69495: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag 
true based on the nmcli output 30582 1726855392.69500: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.69504: getting variables 30582 1726855392.69505: in VariableManager get_vars() 30582 1726855392.69552: Calling all_inventory to load vars for managed_node3 30582 1726855392.69555: Calling groups_inventory to load vars for managed_node3 30582 1726855392.69558: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.69571: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.69574: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.69576: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.70430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.71320: done with get_vars() 30582 1726855392.71342: done getting variables 30582 1726855392.71391: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 14:03:12 -0400 (0:00:00.362) 0:02:09.064 ****** 30582 1726855392.71416: entering _queue_task() for managed_node3/set_fact 30582 1726855392.71698: worker is 1 (out of 1 available) 30582 1726855392.71712: exiting _queue_task() for managed_node3/set_fact 30582 1726855392.71726: done queuing things up, now waiting for results queue to drain 30582 1726855392.71728: waiting for pending results... 
30582 1726855392.71913: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30582 1726855392.71997: in run() - task 0affcc66-ac2b-aa83-7d57-00000000288c 30582 1726855392.72009: variable 'ansible_search_path' from source: unknown 30582 1726855392.72013: variable 'ansible_search_path' from source: unknown 30582 1726855392.72041: calling self._execute() 30582 1726855392.72119: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.72123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.72132: variable 'omit' from source: magic vars 30582 1726855392.72425: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.72434: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.72535: variable 'nm_profile_exists' from source: set_fact 30582 1726855392.72546: Evaluated conditional (nm_profile_exists.rc == 0): False 30582 1726855392.72548: when evaluation is False, skipping this task 30582 1726855392.72551: _execute() done 30582 1726855392.72554: dumping result to json 30582 1726855392.72556: done dumping result, returning 30582 1726855392.72565: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-aa83-7d57-00000000288c] 30582 1726855392.72572: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288c 30582 1726855392.72660: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288c 30582 1726855392.72663: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30582 1726855392.72709: no more pending results, returning what we have 30582 1726855392.72714: results queue empty 30582 1726855392.72715: checking for any_errors_fatal 30582 
1726855392.72727: done checking for any_errors_fatal 30582 1726855392.72727: checking for max_fail_percentage 30582 1726855392.72730: done checking for max_fail_percentage 30582 1726855392.72731: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.72732: done checking to see if all hosts have failed 30582 1726855392.72732: getting the remaining hosts for this loop 30582 1726855392.72734: done getting the remaining hosts for this loop 30582 1726855392.72738: getting the next task for host managed_node3 30582 1726855392.72749: done getting next task for host managed_node3 30582 1726855392.72751: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855392.72758: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.72763: getting variables 30582 1726855392.72764: in VariableManager get_vars() 30582 1726855392.72813: Calling all_inventory to load vars for managed_node3 30582 1726855392.72816: Calling groups_inventory to load vars for managed_node3 30582 1726855392.72819: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.72831: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.72833: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.72836: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.73799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.74665: done with get_vars() 30582 1726855392.74685: done getting variables 30582 1726855392.74733: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855392.74825: variable 'profile' from source: play vars 30582 1726855392.74828: variable 'interface' from source: play vars 30582 1726855392.74873: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 14:03:12 -0400 (0:00:00.034) 0:02:09.098 ****** 30582 1726855392.74900: entering _queue_task() for managed_node3/command 30582 1726855392.75177: worker is 1 (out of 1 available) 30582 1726855392.75192: exiting _queue_task() for managed_node3/command 30582 1726855392.75207: done queuing things up, now waiting for results queue to drain 30582 1726855392.75208: waiting for pending results... 
30582 1726855392.75401: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr 30582 1726855392.75480: in run() - task 0affcc66-ac2b-aa83-7d57-00000000288e 30582 1726855392.75495: variable 'ansible_search_path' from source: unknown 30582 1726855392.75499: variable 'ansible_search_path' from source: unknown 30582 1726855392.75526: calling self._execute() 30582 1726855392.75606: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.75610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.75619: variable 'omit' from source: magic vars 30582 1726855392.75896: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.75906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.75997: variable 'profile_stat' from source: set_fact 30582 1726855392.76005: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855392.76009: when evaluation is False, skipping this task 30582 1726855392.76012: _execute() done 30582 1726855392.76014: dumping result to json 30582 1726855392.76017: done dumping result, returning 30582 1726855392.76024: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000288e] 30582 1726855392.76029: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288e 30582 1726855392.76112: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288e 30582 1726855392.76114: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855392.76163: no more pending results, returning what we have 30582 1726855392.76168: results queue empty 30582 1726855392.76169: checking for any_errors_fatal 30582 1726855392.76177: done checking for any_errors_fatal 30582 1726855392.76178: 
checking for max_fail_percentage 30582 1726855392.76180: done checking for max_fail_percentage 30582 1726855392.76181: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.76182: done checking to see if all hosts have failed 30582 1726855392.76182: getting the remaining hosts for this loop 30582 1726855392.76184: done getting the remaining hosts for this loop 30582 1726855392.76190: getting the next task for host managed_node3 30582 1726855392.76198: done getting next task for host managed_node3 30582 1726855392.76200: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30582 1726855392.76205: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.76211: getting variables 30582 1726855392.76213: in VariableManager get_vars() 30582 1726855392.76262: Calling all_inventory to load vars for managed_node3 30582 1726855392.76265: Calling groups_inventory to load vars for managed_node3 30582 1726855392.76268: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.76279: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.76282: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.76284: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.77110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.78112: done with get_vars() 30582 1726855392.78129: done getting variables 30582 1726855392.78172: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855392.78254: variable 'profile' from source: play vars 30582 1726855392.78257: variable 'interface' from source: play vars 30582 1726855392.78299: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 14:03:12 -0400 (0:00:00.034) 0:02:09.133 ****** 30582 1726855392.78322: entering _queue_task() for managed_node3/set_fact 30582 1726855392.78576: worker is 1 (out of 1 available) 30582 1726855392.78591: exiting _queue_task() for managed_node3/set_fact 30582 1726855392.78605: done queuing things up, now waiting for results queue to drain 30582 1726855392.78606: waiting for pending results... 
30582 1726855392.78793: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr 30582 1726855392.78884: in run() - task 0affcc66-ac2b-aa83-7d57-00000000288f 30582 1726855392.78902: variable 'ansible_search_path' from source: unknown 30582 1726855392.78905: variable 'ansible_search_path' from source: unknown 30582 1726855392.78935: calling self._execute() 30582 1726855392.79011: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.79015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.79024: variable 'omit' from source: magic vars 30582 1726855392.79302: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.79312: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.79397: variable 'profile_stat' from source: set_fact 30582 1726855392.79406: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855392.79410: when evaluation is False, skipping this task 30582 1726855392.79412: _execute() done 30582 1726855392.79415: dumping result to json 30582 1726855392.79417: done dumping result, returning 30582 1726855392.79425: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-00000000288f] 30582 1726855392.79429: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288f 30582 1726855392.79519: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000288f 30582 1726855392.79523: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855392.79572: no more pending results, returning what we have 30582 1726855392.79577: results queue empty 30582 1726855392.79578: checking for any_errors_fatal 30582 1726855392.79586: done checking for any_errors_fatal 30582 1726855392.79589: 
checking for max_fail_percentage 30582 1726855392.79591: done checking for max_fail_percentage 30582 1726855392.79592: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.79593: done checking to see if all hosts have failed 30582 1726855392.79593: getting the remaining hosts for this loop 30582 1726855392.79595: done getting the remaining hosts for this loop 30582 1726855392.79599: getting the next task for host managed_node3 30582 1726855392.79606: done getting next task for host managed_node3 30582 1726855392.79609: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30582 1726855392.79614: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.79619: getting variables 30582 1726855392.79620: in VariableManager get_vars() 30582 1726855392.79672: Calling all_inventory to load vars for managed_node3 30582 1726855392.79675: Calling groups_inventory to load vars for managed_node3 30582 1726855392.79678: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.79691: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.79694: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.79697: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.80522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.81410: done with get_vars() 30582 1726855392.81432: done getting variables 30582 1726855392.81482: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855392.81568: variable 'profile' from source: play vars 30582 1726855392.81571: variable 'interface' from source: play vars 30582 1726855392.81614: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 14:03:12 -0400 (0:00:00.033) 0:02:09.166 ****** 30582 1726855392.81639: entering _queue_task() for managed_node3/command 30582 1726855392.81922: worker is 1 (out of 1 available) 30582 1726855392.81938: exiting _queue_task() for managed_node3/command 30582 1726855392.81951: done queuing things up, now waiting for results queue to drain 30582 1726855392.81952: waiting for pending results... 
30582 1726855392.82135: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr 30582 1726855392.82227: in run() - task 0affcc66-ac2b-aa83-7d57-000000002890 30582 1726855392.82238: variable 'ansible_search_path' from source: unknown 30582 1726855392.82241: variable 'ansible_search_path' from source: unknown 30582 1726855392.82272: calling self._execute() 30582 1726855392.82346: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.82350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.82358: variable 'omit' from source: magic vars 30582 1726855392.82629: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.82640: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.82724: variable 'profile_stat' from source: set_fact 30582 1726855392.82733: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855392.82737: when evaluation is False, skipping this task 30582 1726855392.82740: _execute() done 30582 1726855392.82742: dumping result to json 30582 1726855392.82745: done dumping result, returning 30582 1726855392.82751: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000002890] 30582 1726855392.82756: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002890 30582 1726855392.82843: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002890 30582 1726855392.82845: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855392.82897: no more pending results, returning what we have 30582 1726855392.82901: results queue empty 30582 1726855392.82902: checking for any_errors_fatal 30582 1726855392.82911: done checking for any_errors_fatal 30582 1726855392.82911: checking for 
max_fail_percentage 30582 1726855392.82914: done checking for max_fail_percentage 30582 1726855392.82914: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.82915: done checking to see if all hosts have failed 30582 1726855392.82916: getting the remaining hosts for this loop 30582 1726855392.82917: done getting the remaining hosts for this loop 30582 1726855392.82921: getting the next task for host managed_node3 30582 1726855392.82930: done getting next task for host managed_node3 30582 1726855392.82932: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30582 1726855392.82938: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.82943: getting variables 30582 1726855392.82945: in VariableManager get_vars() 30582 1726855392.82996: Calling all_inventory to load vars for managed_node3 30582 1726855392.82999: Calling groups_inventory to load vars for managed_node3 30582 1726855392.83002: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.83014: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.83017: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.83019: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.84032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.84904: done with get_vars() 30582 1726855392.84921: done getting variables 30582 1726855392.84967: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855392.85048: variable 'profile' from source: play vars 30582 1726855392.85052: variable 'interface' from source: play vars 30582 1726855392.85095: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 14:03:12 -0400 (0:00:00.034) 0:02:09.201 ****** 30582 1726855392.85119: entering _queue_task() for managed_node3/set_fact 30582 1726855392.85384: worker is 1 (out of 1 available) 30582 1726855392.85400: exiting _queue_task() for managed_node3/set_fact 30582 1726855392.85413: done queuing things up, now waiting for results queue to drain 30582 1726855392.85414: waiting for pending results... 
30582 1726855392.85597: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr 30582 1726855392.85686: in run() - task 0affcc66-ac2b-aa83-7d57-000000002891 30582 1726855392.85700: variable 'ansible_search_path' from source: unknown 30582 1726855392.85704: variable 'ansible_search_path' from source: unknown 30582 1726855392.85732: calling self._execute() 30582 1726855392.85809: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.85813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.85821: variable 'omit' from source: magic vars 30582 1726855392.86094: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.86104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.86188: variable 'profile_stat' from source: set_fact 30582 1726855392.86198: Evaluated conditional (profile_stat.stat.exists): False 30582 1726855392.86201: when evaluation is False, skipping this task 30582 1726855392.86204: _execute() done 30582 1726855392.86206: dumping result to json 30582 1726855392.86209: done dumping result, returning 30582 1726855392.86217: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-statebr [0affcc66-ac2b-aa83-7d57-000000002891] 30582 1726855392.86221: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002891 30582 1726855392.86311: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002891 30582 1726855392.86314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30582 1726855392.86360: no more pending results, returning what we have 30582 1726855392.86367: results queue empty 30582 1726855392.86368: checking for any_errors_fatal 30582 1726855392.86376: done checking for any_errors_fatal 30582 1726855392.86376: checking 
for max_fail_percentage 30582 1726855392.86378: done checking for max_fail_percentage 30582 1726855392.86379: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.86380: done checking to see if all hosts have failed 30582 1726855392.86380: getting the remaining hosts for this loop 30582 1726855392.86382: done getting the remaining hosts for this loop 30582 1726855392.86385: getting the next task for host managed_node3 30582 1726855392.86397: done getting next task for host managed_node3 30582 1726855392.86401: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30582 1726855392.86405: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855392.86410: getting variables 30582 1726855392.86411: in VariableManager get_vars() 30582 1726855392.86461: Calling all_inventory to load vars for managed_node3 30582 1726855392.86466: Calling groups_inventory to load vars for managed_node3 30582 1726855392.86469: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.86480: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.86483: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.86485: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.87308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.88196: done with get_vars() 30582 1726855392.88215: done getting variables 30582 1726855392.88260: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855392.88345: variable 'profile' from source: play vars 30582 1726855392.88348: variable 'interface' from source: play vars 30582 1726855392.88393: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 14:03:12 -0400 (0:00:00.032) 0:02:09.234 ****** 30582 1726855392.88418: entering _queue_task() for managed_node3/assert 30582 1726855392.88689: worker is 1 (out of 1 available) 30582 1726855392.88703: exiting _queue_task() for managed_node3/assert 30582 1726855392.88715: done queuing things up, now waiting for results queue to drain 30582 1726855392.88717: waiting for pending results... 
30582 1726855392.88900: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' 30582 1726855392.88980: in run() - task 0affcc66-ac2b-aa83-7d57-000000002805 30582 1726855392.88997: variable 'ansible_search_path' from source: unknown 30582 1726855392.89001: variable 'ansible_search_path' from source: unknown 30582 1726855392.89028: calling self._execute() 30582 1726855392.89109: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.89112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.89121: variable 'omit' from source: magic vars 30582 1726855392.89404: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.89414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.89420: variable 'omit' from source: magic vars 30582 1726855392.89454: variable 'omit' from source: magic vars 30582 1726855392.89530: variable 'profile' from source: play vars 30582 1726855392.89534: variable 'interface' from source: play vars 30582 1726855392.89581: variable 'interface' from source: play vars 30582 1726855392.89601: variable 'omit' from source: magic vars 30582 1726855392.89633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855392.89659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855392.89678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855392.89693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.89706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.89731: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 30582 1726855392.89735: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.89737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.89812: Set connection var ansible_timeout to 10 30582 1726855392.89816: Set connection var ansible_connection to ssh 30582 1726855392.89822: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855392.89825: Set connection var ansible_pipelining to False 30582 1726855392.89830: Set connection var ansible_shell_executable to /bin/sh 30582 1726855392.89833: Set connection var ansible_shell_type to sh 30582 1726855392.89851: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.89854: variable 'ansible_connection' from source: unknown 30582 1726855392.89856: variable 'ansible_module_compression' from source: unknown 30582 1726855392.89859: variable 'ansible_shell_type' from source: unknown 30582 1726855392.89861: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.89863: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.89869: variable 'ansible_pipelining' from source: unknown 30582 1726855392.89872: variable 'ansible_timeout' from source: unknown 30582 1726855392.89874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.89980: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855392.89991: variable 'omit' from source: magic vars 30582 1726855392.89996: starting attempt loop 30582 1726855392.89999: running the handler 30582 1726855392.90090: variable 'lsr_net_profile_exists' from source: set_fact 30582 1726855392.90094: Evaluated conditional (not 
lsr_net_profile_exists): True 30582 1726855392.90100: handler run complete 30582 1726855392.90112: attempt loop complete, returning result 30582 1726855392.90114: _execute() done 30582 1726855392.90117: dumping result to json 30582 1726855392.90120: done dumping result, returning 30582 1726855392.90126: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-000000002805] 30582 1726855392.90132: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002805 30582 1726855392.90216: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002805 30582 1726855392.90219: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855392.90289: no more pending results, returning what we have 30582 1726855392.90293: results queue empty 30582 1726855392.90294: checking for any_errors_fatal 30582 1726855392.90302: done checking for any_errors_fatal 30582 1726855392.90302: checking for max_fail_percentage 30582 1726855392.90304: done checking for max_fail_percentage 30582 1726855392.90305: checking to see if all hosts have failed and the running result is not ok 30582 1726855392.90306: done checking to see if all hosts have failed 30582 1726855392.90306: getting the remaining hosts for this loop 30582 1726855392.90308: done getting the remaining hosts for this loop 30582 1726855392.90311: getting the next task for host managed_node3 30582 1726855392.90322: done getting next task for host managed_node3 30582 1726855392.90325: ^ task is: TASK: Get NetworkManager RPM version 30582 1726855392.90329: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855392.90334: getting variables 30582 1726855392.90335: in VariableManager get_vars() 30582 1726855392.90383: Calling all_inventory to load vars for managed_node3 30582 1726855392.90385: Calling groups_inventory to load vars for managed_node3 30582 1726855392.90390: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855392.90401: Calling all_plugins_play to load vars for managed_node3 30582 1726855392.90404: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855392.90406: Calling groups_plugins_play to load vars for managed_node3 30582 1726855392.91412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855392.92277: done with get_vars() 30582 1726855392.92300: done getting variables 30582 1726855392.92346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Friday 20 September 2024 14:03:12 -0400 (0:00:00.039) 0:02:09.273 ****** 30582 1726855392.92377: entering _queue_task() for managed_node3/command 30582 1726855392.92647: worker is 1 (out of 1 available) 30582 1726855392.92662: exiting _queue_task() for managed_node3/command 30582 1726855392.92676: done queuing things up, now waiting for results queue to drain 30582 1726855392.92678: waiting for pending results... 30582 1726855392.92866: running TaskExecutor() for managed_node3/TASK: Get NetworkManager RPM version 30582 1726855392.92948: in run() - task 0affcc66-ac2b-aa83-7d57-000000002809 30582 1726855392.92959: variable 'ansible_search_path' from source: unknown 30582 1726855392.92963: variable 'ansible_search_path' from source: unknown 30582 1726855392.92994: calling self._execute() 30582 1726855392.93075: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.93078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.93089: variable 'omit' from source: magic vars 30582 1726855392.93379: variable 'ansible_distribution_major_version' from source: facts 30582 1726855392.93390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855392.93397: variable 'omit' from source: magic vars 30582 1726855392.93430: variable 'omit' from source: magic vars 30582 1726855392.93456: variable 'omit' from source: magic vars 30582 1726855392.93493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855392.93519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855392.93536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855392.93550: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.93562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855392.93590: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855392.93594: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.93596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.93670: Set connection var ansible_timeout to 10 30582 1726855392.93673: Set connection var ansible_connection to ssh 30582 1726855392.93680: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855392.93685: Set connection var ansible_pipelining to False 30582 1726855392.93691: Set connection var ansible_shell_executable to /bin/sh 30582 1726855392.93694: Set connection var ansible_shell_type to sh 30582 1726855392.93712: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.93714: variable 'ansible_connection' from source: unknown 30582 1726855392.93717: variable 'ansible_module_compression' from source: unknown 30582 1726855392.93719: variable 'ansible_shell_type' from source: unknown 30582 1726855392.93722: variable 'ansible_shell_executable' from source: unknown 30582 1726855392.93724: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855392.93728: variable 'ansible_pipelining' from source: unknown 30582 1726855392.93730: variable 'ansible_timeout' from source: unknown 30582 1726855392.93734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855392.93840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855392.93849: variable 'omit' from source: magic vars 30582 1726855392.93855: starting attempt loop 30582 1726855392.93858: running the handler 30582 1726855392.93874: _low_level_execute_command(): starting 30582 1726855392.93888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855392.94401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.94406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855392.94409: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855392.94411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.94462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855392.94468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.94470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.94543: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.96230: stdout chunk (state=3): >>>/root <<< 30582 1726855392.96321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.96353: stderr chunk (state=3): >>><<< 30582 1726855392.96357: stdout chunk (state=3): >>><<< 30582 1726855392.96383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.96396: _low_level_execute_command(): starting 30582 1726855392.96403: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670 `" && echo ansible-tmp-1726855392.963815-36090-14048909488670="` echo 
/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670 `" ) && sleep 0' 30582 1726855392.97046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855392.97084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855392.97184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855392.99102: stdout chunk (state=3): >>>ansible-tmp-1726855392.963815-36090-14048909488670=/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670 <<< 30582 1726855392.99213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855392.99239: stderr chunk (state=3): >>><<< 30582 1726855392.99242: stdout chunk (state=3): >>><<< 30582 1726855392.99259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855392.963815-36090-14048909488670=/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855392.99291: variable 'ansible_module_compression' from source: unknown 30582 1726855392.99332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855392.99362: variable 'ansible_facts' from source: unknown 30582 1726855392.99424: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py 30582 1726855392.99527: Sending initial data 30582 1726855392.99530: Sent initial data (154 bytes) 30582 1726855392.99969: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855392.99972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855392.99975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855392.99977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855392.99980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.00033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.00039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.00042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.00097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.01675: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855393.01733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855393.01794: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmphiy73d14 /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py <<< 30582 1726855393.01800: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py" <<< 30582 1726855393.01853: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmphiy73d14" to remote "/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py" <<< 30582 1726855393.01858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py" <<< 30582 1726855393.02443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.02485: stderr chunk (state=3): >>><<< 30582 1726855393.02490: stdout chunk (state=3): >>><<< 30582 1726855393.02522: done transferring module to remote 30582 1726855393.02532: _low_level_execute_command(): starting 30582 1726855393.02537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/ /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py && sleep 0' 30582 1726855393.02985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.02991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.02994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.02996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.03044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.03047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.03052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.03112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.04921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.04944: stderr chunk (state=3): >>><<< 30582 1726855393.04947: stdout chunk (state=3): >>><<< 30582 1726855393.04965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855393.04968: _low_level_execute_command(): starting 30582 1726855393.04971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/AnsiballZ_command.py && sleep 0' 30582 1726855393.05433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.05437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.05439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.05442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match found <<< 30582 1726855393.05444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.05494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.05498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.05514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.05579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.37186: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 14:03:13.207251", "end": "2024-09-20 14:03:13.370933", "delta": "0:00:00.163682", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855393.38812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855393.38839: stderr chunk (state=3): >>><<< 30582 1726855393.38842: stdout chunk (state=3): >>><<< 30582 1726855393.38858: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.48.10-1.el10", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 14:03:13.207251", "end": "2024-09-20 14:03:13.370933", "delta": "0:00:00.163682", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 
closed. 30582 1726855393.38896: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855393.38903: _low_level_execute_command(): starting 30582 1726855393.38908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855392.963815-36090-14048909488670/ > /dev/null 2>&1 && sleep 0' 30582 1726855393.39366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.39370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.39373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.39375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855393.39381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.39426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.39429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.39431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.39496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.41326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.41351: stderr chunk (state=3): >>><<< 30582 1726855393.41354: stdout chunk (state=3): >>><<< 30582 1726855393.41368: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855393.41376: handler run complete 30582 1726855393.41397: Evaluated conditional (False): False 30582 1726855393.41407: attempt loop complete, returning result 30582 1726855393.41411: _execute() done 30582 1726855393.41414: dumping result to json 30582 1726855393.41416: done dumping result, returning 30582 1726855393.41425: done running TaskExecutor() for managed_node3/TASK: Get NetworkManager RPM version [0affcc66-ac2b-aa83-7d57-000000002809] 30582 1726855393.41430: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002809 30582 1726855393.41532: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002809 30582 1726855393.41535: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.163682", "end": "2024-09-20 14:03:13.370933", "rc": 0, "start": "2024-09-20 14:03:13.207251" } STDOUT: NetworkManager-1.48.10-1.el10 30582 1726855393.41610: no more pending results, returning what we have 30582 1726855393.41614: results queue empty 30582 1726855393.41615: checking for any_errors_fatal 30582 1726855393.41623: done checking for any_errors_fatal 30582 1726855393.41624: checking for max_fail_percentage 30582 1726855393.41625: done checking for max_fail_percentage 30582 1726855393.41626: checking to see if all hosts have failed and the running result is not ok 30582 1726855393.41627: done checking to see if all hosts have failed 30582 1726855393.41628: getting the remaining hosts for this loop 30582 1726855393.41629: done getting the remaining hosts for this loop 30582 1726855393.41633: getting the next task for host managed_node3 30582 1726855393.41640: done getting next task for host managed_node3 30582 1726855393.41642: ^ task is: TASK: Store NetworkManager 
version 30582 1726855393.41650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855393.41655: getting variables 30582 1726855393.41656: in VariableManager get_vars() 30582 1726855393.41705: Calling all_inventory to load vars for managed_node3 30582 1726855393.41707: Calling groups_inventory to load vars for managed_node3 30582 1726855393.41710: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.41721: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.41724: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.41726: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.42570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.43451: done with get_vars() 30582 1726855393.43475: done getting variables 30582 1726855393.43522: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 14:03:13 -0400 (0:00:00.511) 0:02:09.785 ****** 30582 1726855393.43547: entering _queue_task() for managed_node3/set_fact 30582 1726855393.43816: worker is 1 (out of 1 available) 30582 1726855393.43833: exiting _queue_task() for managed_node3/set_fact 30582 1726855393.43846: done queuing things up, now waiting for results queue to drain 30582 1726855393.43848: waiting for pending results... 30582 1726855393.44030: running TaskExecutor() for managed_node3/TASK: Store NetworkManager version 30582 1726855393.44109: in run() - task 0affcc66-ac2b-aa83-7d57-00000000280a 30582 1726855393.44122: variable 'ansible_search_path' from source: unknown 30582 1726855393.44126: variable 'ansible_search_path' from source: unknown 30582 1726855393.44154: calling self._execute() 30582 1726855393.44232: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.44236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.44244: variable 'omit' from source: magic vars 30582 1726855393.44521: variable 'ansible_distribution_major_version' from source: facts 30582 1726855393.44531: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855393.44537: variable 'omit' from source: magic vars 30582 1726855393.44573: variable 'omit' from source: magic vars 30582 1726855393.44652: variable '__rpm_q_networkmanager' from source: set_fact 30582 1726855393.44669: variable 'omit' from source: magic vars 30582 1726855393.44704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855393.44733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30582 1726855393.44750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855393.44765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.44774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.44799: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855393.44803: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.44806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.44880: Set connection var ansible_timeout to 10 30582 1726855393.44883: Set connection var ansible_connection to ssh 30582 1726855393.44890: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855393.44895: Set connection var ansible_pipelining to False 30582 1726855393.44900: Set connection var ansible_shell_executable to /bin/sh 30582 1726855393.44902: Set connection var ansible_shell_type to sh 30582 1726855393.44919: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.44922: variable 'ansible_connection' from source: unknown 30582 1726855393.44924: variable 'ansible_module_compression' from source: unknown 30582 1726855393.44927: variable 'ansible_shell_type' from source: unknown 30582 1726855393.44929: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.44931: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.44935: variable 'ansible_pipelining' from source: unknown 30582 1726855393.44937: variable 'ansible_timeout' from source: unknown 30582 1726855393.44942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.45042: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855393.45054: variable 'omit' from source: magic vars 30582 1726855393.45057: starting attempt loop 30582 1726855393.45060: running the handler 30582 1726855393.45073: handler run complete 30582 1726855393.45081: attempt loop complete, returning result 30582 1726855393.45083: _execute() done 30582 1726855393.45088: dumping result to json 30582 1726855393.45091: done dumping result, returning 30582 1726855393.45097: done running TaskExecutor() for managed_node3/TASK: Store NetworkManager version [0affcc66-ac2b-aa83-7d57-00000000280a] 30582 1726855393.45102: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000280a 30582 1726855393.45188: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000280a 30582 1726855393.45191: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" }, "changed": false } 30582 1726855393.45245: no more pending results, returning what we have 30582 1726855393.45248: results queue empty 30582 1726855393.45249: checking for any_errors_fatal 30582 1726855393.45259: done checking for any_errors_fatal 30582 1726855393.45260: checking for max_fail_percentage 30582 1726855393.45262: done checking for max_fail_percentage 30582 1726855393.45265: checking to see if all hosts have failed and the running result is not ok 30582 1726855393.45266: done checking to see if all hosts have failed 30582 1726855393.45266: getting the remaining hosts for this loop 30582 1726855393.45268: done getting the remaining hosts for this loop 30582 1726855393.45272: getting the next task for host managed_node3 30582 1726855393.45280: done getting next task for host managed_node3 30582 1726855393.45283: ^ task is: 
TASK: Show NetworkManager version 30582 1726855393.45286: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855393.45292: getting variables 30582 1726855393.45294: in VariableManager get_vars() 30582 1726855393.45341: Calling all_inventory to load vars for managed_node3 30582 1726855393.45344: Calling groups_inventory to load vars for managed_node3 30582 1726855393.45346: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.45357: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.45359: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.45362: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.46324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.47199: done with get_vars() 30582 1726855393.47218: done getting variables 30582 1726855393.47262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 14:03:13 -0400 (0:00:00.037) 0:02:09.822 ****** 30582 1726855393.47290: entering _queue_task() for managed_node3/debug 30582 1726855393.47557: worker is 1 (out of 1 available) 30582 1726855393.47574: exiting _queue_task() for managed_node3/debug 30582 1726855393.47586: done queuing things up, now waiting for results queue to drain 30582 1726855393.47590: waiting for pending results... 30582 1726855393.47774: running TaskExecutor() for managed_node3/TASK: Show NetworkManager version 30582 1726855393.47861: in run() - task 0affcc66-ac2b-aa83-7d57-00000000280b 30582 1726855393.47873: variable 'ansible_search_path' from source: unknown 30582 1726855393.47876: variable 'ansible_search_path' from source: unknown 30582 1726855393.47908: calling self._execute() 30582 1726855393.47982: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.47986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.47996: variable 'omit' from source: magic vars 30582 1726855393.48268: variable 'ansible_distribution_major_version' from source: facts 30582 1726855393.48276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855393.48282: variable 'omit' from source: magic vars 30582 1726855393.48316: variable 'omit' from source: magic vars 30582 1726855393.48340: variable 'omit' from source: magic vars 30582 1726855393.48375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855393.48402: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855393.48419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855393.48433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.48442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.48473: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855393.48476: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.48478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.48546: Set connection var ansible_timeout to 10 30582 1726855393.48549: Set connection var ansible_connection to ssh 30582 1726855393.48554: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855393.48559: Set connection var ansible_pipelining to False 30582 1726855393.48567: Set connection var ansible_shell_executable to /bin/sh 30582 1726855393.48569: Set connection var ansible_shell_type to sh 30582 1726855393.48589: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.48592: variable 'ansible_connection' from source: unknown 30582 1726855393.48595: variable 'ansible_module_compression' from source: unknown 30582 1726855393.48597: variable 'ansible_shell_type' from source: unknown 30582 1726855393.48599: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.48601: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.48606: variable 'ansible_pipelining' from source: unknown 30582 1726855393.48608: variable 'ansible_timeout' from source: unknown 30582 1726855393.48612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 
1726855393.48716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855393.48725: variable 'omit' from source: magic vars 30582 1726855393.48730: starting attempt loop 30582 1726855393.48733: running the handler 30582 1726855393.48771: variable 'networkmanager_nvr' from source: set_fact 30582 1726855393.48832: variable 'networkmanager_nvr' from source: set_fact 30582 1726855393.48839: handler run complete 30582 1726855393.48853: attempt loop complete, returning result 30582 1726855393.48855: _execute() done 30582 1726855393.48858: dumping result to json 30582 1726855393.48860: done dumping result, returning 30582 1726855393.48868: done running TaskExecutor() for managed_node3/TASK: Show NetworkManager version [0affcc66-ac2b-aa83-7d57-00000000280b] 30582 1726855393.48873: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000280b 30582 1726855393.48960: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000280b 30582 1726855393.48965: WORKER PROCESS EXITING ok: [managed_node3] => { "networkmanager_nvr": "NetworkManager-1.48.10-1.el10" } 30582 1726855393.49010: no more pending results, returning what we have 30582 1726855393.49014: results queue empty 30582 1726855393.49015: checking for any_errors_fatal 30582 1726855393.49022: done checking for any_errors_fatal 30582 1726855393.49023: checking for max_fail_percentage 30582 1726855393.49025: done checking for max_fail_percentage 30582 1726855393.49026: checking to see if all hosts have failed and the running result is not ok 30582 1726855393.49026: done checking to see if all hosts have failed 30582 1726855393.49027: getting the remaining hosts for this loop 30582 1726855393.49029: done getting the remaining hosts for this loop 30582 
1726855393.49032: getting the next task for host managed_node3 30582 1726855393.49041: done getting next task for host managed_node3 30582 1726855393.49045: ^ task is: TASK: Conditional asserts 30582 1726855393.49048: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855393.49054: getting variables 30582 1726855393.49056: in VariableManager get_vars() 30582 1726855393.49105: Calling all_inventory to load vars for managed_node3 30582 1726855393.49108: Calling groups_inventory to load vars for managed_node3 30582 1726855393.49111: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.49121: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.49123: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.49126: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.49941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.50937: done with get_vars() 30582 1726855393.50953: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 14:03:13 -0400 (0:00:00.037) 0:02:09.860 ****** 30582 1726855393.51025: entering _queue_task() for managed_node3/include_tasks 30582 1726855393.51291: worker is 
1 (out of 1 available) 30582 1726855393.51304: exiting _queue_task() for managed_node3/include_tasks 30582 1726855393.51317: done queuing things up, now waiting for results queue to drain 30582 1726855393.51319: waiting for pending results... 30582 1726855393.51505: running TaskExecutor() for managed_node3/TASK: Conditional asserts 30582 1726855393.51578: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b3 30582 1726855393.51591: variable 'ansible_search_path' from source: unknown 30582 1726855393.51595: variable 'ansible_search_path' from source: unknown 30582 1726855393.51807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30582 1726855393.53297: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30582 1726855393.53343: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30582 1726855393.53373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30582 1726855393.53401: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30582 1726855393.53423: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30582 1726855393.53494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30582 1726855393.53515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30582 1726855393.53533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 30582 1726855393.53558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30582 1726855393.53570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30582 1726855393.53649: variable 'lsr_assert_when' from source: include params 30582 1726855393.53734: variable 'network_provider' from source: set_fact 30582 1726855393.53794: variable 'omit' from source: magic vars 30582 1726855393.53873: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.53882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.53892: variable 'omit' from source: magic vars 30582 1726855393.54027: variable 'ansible_distribution_major_version' from source: facts 30582 1726855393.54034: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855393.54109: variable 'item' from source: unknown 30582 1726855393.54116: Evaluated conditional (item['condition']): True 30582 1726855393.54169: variable 'item' from source: unknown 30582 1726855393.54192: variable 'item' from source: unknown 30582 1726855393.54240: variable 'item' from source: unknown 30582 1726855393.54384: dumping result to json 30582 1726855393.54389: done dumping result, returning 30582 1726855393.54391: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcc66-ac2b-aa83-7d57-0000000020b3] 30582 1726855393.54393: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b3 30582 1726855393.54428: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b3 30582 1726855393.54430: WORKER PROCESS EXITING 30582 1726855393.54454: no more pending results, returning what we 
have 30582 1726855393.54459: in VariableManager get_vars() 30582 1726855393.54513: Calling all_inventory to load vars for managed_node3 30582 1726855393.54516: Calling groups_inventory to load vars for managed_node3 30582 1726855393.54519: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.54530: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.54533: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.54535: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.55376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.56244: done with get_vars() 30582 1726855393.56265: variable 'ansible_search_path' from source: unknown 30582 1726855393.56266: variable 'ansible_search_path' from source: unknown 30582 1726855393.56297: we have included files to process 30582 1726855393.56298: generating all_blocks data 30582 1726855393.56300: done generating all_blocks data 30582 1726855393.56304: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855393.56305: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855393.56306: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30582 1726855393.56386: in VariableManager get_vars() 30582 1726855393.56405: done with get_vars() 30582 1726855393.56483: done processing included file 30582 1726855393.56485: iterating over new_blocks loaded from include file 30582 1726855393.56486: in VariableManager get_vars() 30582 1726855393.56499: done with get_vars() 30582 1726855393.56500: filtering new block on tags 30582 1726855393.56522: done filtering new block on tags 30582 
1726855393.56523: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 30582 1726855393.56527: extending task lists for all hosts with included blocks 30582 1726855393.57304: done extending task lists 30582 1726855393.57305: done processing included files 30582 1726855393.57306: results queue empty 30582 1726855393.57306: checking for any_errors_fatal 30582 1726855393.57309: done checking for any_errors_fatal 30582 1726855393.57309: checking for max_fail_percentage 30582 1726855393.57310: done checking for max_fail_percentage 30582 1726855393.57311: checking to see if all hosts have failed and the running result is not ok 30582 1726855393.57311: done checking to see if all hosts have failed 30582 1726855393.57312: getting the remaining hosts for this loop 30582 1726855393.57313: done getting the remaining hosts for this loop 30582 1726855393.57314: getting the next task for host managed_node3 30582 1726855393.57317: done getting next task for host managed_node3 30582 1726855393.57319: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30582 1726855393.57321: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855393.57327: getting variables 30582 1726855393.57328: in VariableManager get_vars() 30582 1726855393.57337: Calling all_inventory to load vars for managed_node3 30582 1726855393.57338: Calling groups_inventory to load vars for managed_node3 30582 1726855393.57340: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.57344: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.57346: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.57348: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.58090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.58940: done with get_vars() 30582 1726855393.58955: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 14:03:13 -0400 (0:00:00.079) 0:02:09.940 ****** 30582 1726855393.59016: entering _queue_task() for managed_node3/include_tasks 30582 1726855393.59295: worker is 1 (out of 1 available) 30582 1726855393.59309: exiting _queue_task() for managed_node3/include_tasks 30582 1726855393.59320: done queuing things up, now waiting for results queue to drain 30582 1726855393.59322: waiting for pending results... 
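The "Conditional asserts" task above queues one include per item and evaluates `item['condition']` before loading the file; in this run the item was `{'what': 'tasks/assert_device_absent.yml', 'condition': True}`, so the include went through. A minimal Python sketch of that filtering pattern (the second item is hypothetical, added only to show the false branch):

```python
# Mimic of the conditional-include filtering seen in the trace: each item
# names a task file and carries a condition; only items whose condition
# evaluates truthy get included.
lsr_assert_when = [
    {"what": "tasks/assert_device_absent.yml", "condition": True},
    {"what": "tasks/assert_profile_present.yml", "condition": False},  # hypothetical
]

def included_files(items):
    """Return the task files whose condition evaluated truthy."""
    return [item["what"] for item in items if item["condition"]]

print(included_files(lsr_assert_when))
# ['tasks/assert_device_absent.yml']
```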
30582 1726855393.59504: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 30582 1726855393.59593: in run() - task 0affcc66-ac2b-aa83-7d57-0000000028d3 30582 1726855393.59606: variable 'ansible_search_path' from source: unknown 30582 1726855393.59609: variable 'ansible_search_path' from source: unknown 30582 1726855393.59635: calling self._execute() 30582 1726855393.59711: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.59714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.59722: variable 'omit' from source: magic vars 30582 1726855393.59994: variable 'ansible_distribution_major_version' from source: facts 30582 1726855393.60004: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855393.60009: _execute() done 30582 1726855393.60013: dumping result to json 30582 1726855393.60016: done dumping result, returning 30582 1726855393.60023: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-aa83-7d57-0000000028d3] 30582 1726855393.60028: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000028d3 30582 1726855393.60116: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000028d3 30582 1726855393.60119: WORKER PROCESS EXITING 30582 1726855393.60147: no more pending results, returning what we have 30582 1726855393.60152: in VariableManager get_vars() 30582 1726855393.60210: Calling all_inventory to load vars for managed_node3 30582 1726855393.60213: Calling groups_inventory to load vars for managed_node3 30582 1726855393.60216: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.60232: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.60235: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.60238: Calling groups_plugins_play to load vars for managed_node3 30582 
1726855393.61069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.61923: done with get_vars() 30582 1726855393.61938: variable 'ansible_search_path' from source: unknown 30582 1726855393.61939: variable 'ansible_search_path' from source: unknown 30582 1726855393.62042: variable 'item' from source: include params 30582 1726855393.62069: we have included files to process 30582 1726855393.62070: generating all_blocks data 30582 1726855393.62071: done generating all_blocks data 30582 1726855393.62072: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855393.62072: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855393.62074: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30582 1726855393.62201: done processing included file 30582 1726855393.62203: iterating over new_blocks loaded from include file 30582 1726855393.62204: in VariableManager get_vars() 30582 1726855393.62217: done with get_vars() 30582 1726855393.62218: filtering new block on tags 30582 1726855393.62235: done filtering new block on tags 30582 1726855393.62237: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 30582 1726855393.62241: extending task lists for all hosts with included blocks 30582 1726855393.62334: done extending task lists 30582 1726855393.62335: done processing included files 30582 1726855393.62336: results queue empty 30582 1726855393.62336: checking for any_errors_fatal 30582 1726855393.62339: done checking for any_errors_fatal 30582 1726855393.62340: checking for 
max_fail_percentage 30582 1726855393.62341: done checking for max_fail_percentage 30582 1726855393.62341: checking to see if all hosts have failed and the running result is not ok 30582 1726855393.62342: done checking to see if all hosts have failed 30582 1726855393.62342: getting the remaining hosts for this loop 30582 1726855393.62343: done getting the remaining hosts for this loop 30582 1726855393.62345: getting the next task for host managed_node3 30582 1726855393.62349: done getting next task for host managed_node3 30582 1726855393.62350: ^ task is: TASK: Get stat for interface {{ interface }} 30582 1726855393.62353: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855393.62354: getting variables 30582 1726855393.62355: in VariableManager get_vars() 30582 1726855393.62365: Calling all_inventory to load vars for managed_node3 30582 1726855393.62366: Calling groups_inventory to load vars for managed_node3 30582 1726855393.62368: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855393.62373: Calling all_plugins_play to load vars for managed_node3 30582 1726855393.62374: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855393.62376: Calling groups_plugins_play to load vars for managed_node3 30582 1726855393.68202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855393.69064: done with get_vars() 30582 1726855393.69088: done getting variables 30582 1726855393.69185: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 14:03:13 -0400 (0:00:00.101) 0:02:10.042 ****** 30582 1726855393.69207: entering _queue_task() for managed_node3/stat 30582 1726855393.69496: worker is 1 (out of 1 available) 30582 1726855393.69511: exiting _queue_task() for managed_node3/stat 30582 1726855393.69522: done queuing things up, now waiting for results queue to drain 30582 1726855393.69525: waiting for pending results... 
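The pending task's name, "Get stat for interface {{ interface }}", is templated against the play var `interface` (resolved here to `statebr`, per the `variable 'interface' from source: play vars` record) before display, which is why the banner reads "Get stat for interface statebr". A toy substitution in Python, standing in for Ansible's real Jinja2 engine purely to illustrate the rendering step:

```python
import re

def render(template: str, variables: dict) -> str:
    """Toy stand-in for Jinja2 templating: replace {{ var }} with its value."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

name = render("Get stat for interface {{ interface }}", {"interface": "statebr"})
print(name)  # Get stat for interface statebr
```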
30582 1726855393.69708: running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr 30582 1726855393.69808: in run() - task 0affcc66-ac2b-aa83-7d57-000000002979 30582 1726855393.69819: variable 'ansible_search_path' from source: unknown 30582 1726855393.69823: variable 'ansible_search_path' from source: unknown 30582 1726855393.69855: calling self._execute() 30582 1726855393.69930: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.69934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.69942: variable 'omit' from source: magic vars 30582 1726855393.70221: variable 'ansible_distribution_major_version' from source: facts 30582 1726855393.70231: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855393.70237: variable 'omit' from source: magic vars 30582 1726855393.70278: variable 'omit' from source: magic vars 30582 1726855393.70355: variable 'interface' from source: play vars 30582 1726855393.70369: variable 'omit' from source: magic vars 30582 1726855393.70407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855393.70433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855393.70451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855393.70468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.70477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855393.70503: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855393.70507: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.70510: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.70580: Set connection var ansible_timeout to 10 30582 1726855393.70584: Set connection var ansible_connection to ssh 30582 1726855393.70590: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855393.70596: Set connection var ansible_pipelining to False 30582 1726855393.70601: Set connection var ansible_shell_executable to /bin/sh 30582 1726855393.70604: Set connection var ansible_shell_type to sh 30582 1726855393.70622: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.70625: variable 'ansible_connection' from source: unknown 30582 1726855393.70628: variable 'ansible_module_compression' from source: unknown 30582 1726855393.70630: variable 'ansible_shell_type' from source: unknown 30582 1726855393.70633: variable 'ansible_shell_executable' from source: unknown 30582 1726855393.70636: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855393.70638: variable 'ansible_pipelining' from source: unknown 30582 1726855393.70640: variable 'ansible_timeout' from source: unknown 30582 1726855393.70643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855393.70789: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30582 1726855393.70800: variable 'omit' from source: magic vars 30582 1726855393.70805: starting attempt loop 30582 1726855393.70808: running the handler 30582 1726855393.70821: _low_level_execute_command(): starting 30582 1726855393.70828: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855393.71345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.71349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855393.71353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855393.71356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.71411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.71414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.71417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.71485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.73183: stdout chunk (state=3): >>>/root <<< 30582 1726855393.73282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.73314: stderr chunk (state=3): >>><<< 30582 1726855393.73317: stdout chunk (state=3): >>><<< 30582 1726855393.73339: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855393.73352: _low_level_execute_command(): starting 30582 1726855393.73358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426 `" && echo ansible-tmp-1726855393.7333775-36105-149565011660426="` echo /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426 `" ) && sleep 0' 30582 1726855393.73805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.73808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.73818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.73821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.73873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.73879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.73881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.73940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.75840: stdout chunk (state=3): >>>ansible-tmp-1726855393.7333775-36105-149565011660426=/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426 <<< 30582 1726855393.75976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.75980: stdout chunk (state=3): >>><<< 30582 1726855393.75986: stderr chunk (state=3): >>><<< 30582 1726855393.76004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855393.7333775-36105-149565011660426=/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855393.76043: variable 'ansible_module_compression' from source: unknown 30582 1726855393.76096: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30582 1726855393.76127: variable 'ansible_facts' from source: unknown 30582 1726855393.76186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py 30582 1726855393.76293: Sending initial data 30582 1726855393.76296: Sent initial data (153 bytes) 30582 1726855393.76737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.76740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855393.76744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.76746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.76748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.76797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.76801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.76867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.78453: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855393.78457: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855393.78510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 30582 1726855393.78572: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp7h90q_xh /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py <<< 30582 1726855393.78577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py" <<< 30582 1726855393.78631: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp7h90q_xh" to remote "/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py" <<< 30582 1726855393.79225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.79270: stderr chunk (state=3): >>><<< 30582 1726855393.79273: stdout chunk (state=3): >>><<< 30582 1726855393.79298: done transferring module to remote 30582 1726855393.79307: _low_level_execute_command(): starting 30582 1726855393.79311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/ /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py && sleep 0' 30582 1726855393.79755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855393.79758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855393.79761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 30582 1726855393.79766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855393.79774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.79807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.79821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.79893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.81675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855393.81701: stderr chunk (state=3): >>><<< 30582 1726855393.81708: stdout chunk (state=3): >>><<< 30582 1726855393.81723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855393.81726: _low_level_execute_command(): starting 30582 1726855393.81730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/AnsiballZ_stat.py && sleep 0' 30582 1726855393.82175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.82179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.82182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855393.82184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855393.82186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.82236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.82240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.82245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.82310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855393.97475: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30582 1726855393.98831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 30582 1726855393.98875: stderr chunk (state=3): >>><<< 30582 1726855393.98879: stdout chunk (state=3): >>><<< 30582 1726855393.98901: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855393.98924: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855393.98933: _low_level_execute_command(): starting 30582 1726855393.98938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855393.7333775-36105-149565011660426/ > /dev/null 2>&1 && sleep 0' 30582 1726855393.99382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.99386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 
1726855393.99390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855393.99392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855393.99446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855393.99449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855393.99454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855393.99515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.01400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.01424: stderr chunk (state=3): >>><<< 30582 1726855394.01429: stdout chunk (state=3): >>><<< 30582 1726855394.01442: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.01448: handler run complete 30582 1726855394.01466: attempt loop complete, returning result 30582 1726855394.01469: _execute() done 30582 1726855394.01472: dumping result to json 30582 1726855394.01473: done dumping result, returning 30582 1726855394.01481: done running TaskExecutor() for managed_node3/TASK: Get stat for interface statebr [0affcc66-ac2b-aa83-7d57-000000002979] 30582 1726855394.01485: sending task result for task 0affcc66-ac2b-aa83-7d57-000000002979 30582 1726855394.01582: done sending task result for task 0affcc66-ac2b-aa83-7d57-000000002979 30582 1726855394.01585: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 30582 1726855394.01641: no more pending results, returning what we have 30582 1726855394.01644: results queue empty 30582 1726855394.01645: checking for any_errors_fatal 30582 1726855394.01647: done checking for any_errors_fatal 30582 1726855394.01647: checking for max_fail_percentage 30582 1726855394.01650: done checking for max_fail_percentage 30582 1726855394.01650: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.01651: done checking to see if all hosts have failed 30582 1726855394.01652: getting the remaining hosts for this loop 30582 1726855394.01653: done getting the remaining hosts for this loop 30582 1726855394.01657: getting the next task for host managed_node3 30582 1726855394.01669: done getting next task for host managed_node3 
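For readability: the stat invocation traced above (module_args logged in the `done with _execute_module` record, result `"stat": {"exists": false}`) corresponds to a task roughly like the following. This is a reconstruction from the logged arguments, not the playbook source; the register name `interface_stat` is inferred from the later assert trace (`Evaluated conditional (not interface_stat.stat.exists)`).

```yaml
# Sketch reconstructed from the logged module_args of the stat call above.
- name: Get stat for interface statebr
  ansible.builtin.stat:
    path: /sys/class/net/statebr   # interface sysfs entry; absent => interface absent
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: interface_stat          # assumed name, taken from the later assert trace
```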
30582 1726855394.01672: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30582 1726855394.01676: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855394.01683: getting variables 30582 1726855394.01684: in VariableManager get_vars() 30582 1726855394.01734: Calling all_inventory to load vars for managed_node3 30582 1726855394.01737: Calling groups_inventory to load vars for managed_node3 30582 1726855394.01740: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.01752: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.01755: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.01757: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.02607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.03605: done with get_vars() 30582 1726855394.03623: done getting variables 30582 1726855394.03671: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855394.03764: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 14:03:14 -0400 (0:00:00.345) 0:02:10.387 ****** 30582 1726855394.03792: entering _queue_task() for managed_node3/assert 30582 1726855394.04066: worker is 1 (out of 1 available) 30582 1726855394.04081: exiting _queue_task() for managed_node3/assert 30582 1726855394.04096: done queuing things up, now waiting for results queue to drain 30582 1726855394.04098: waiting for pending results... 30582 1726855394.04281: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' 30582 1726855394.04379: in run() - task 0affcc66-ac2b-aa83-7d57-0000000028d4 30582 1726855394.04393: variable 'ansible_search_path' from source: unknown 30582 1726855394.04397: variable 'ansible_search_path' from source: unknown 30582 1726855394.04428: calling self._execute() 30582 1726855394.04510: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.04514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.04523: variable 'omit' from source: magic vars 30582 1726855394.04804: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.04814: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.04819: variable 'omit' from source: magic vars 30582 1726855394.04848: variable 'omit' from source: magic vars 30582 1726855394.04923: variable 'interface' from source: play vars 30582 1726855394.04937: variable 'omit' from source: magic vars 30582 1726855394.04973: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855394.05004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855394.05020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855394.05034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.05044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.05071: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855394.05075: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.05078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.05150: Set connection var ansible_timeout to 10 30582 1726855394.05153: Set connection var ansible_connection to ssh 30582 1726855394.05158: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855394.05163: Set connection var ansible_pipelining to False 30582 1726855394.05170: Set connection var ansible_shell_executable to /bin/sh 30582 1726855394.05173: Set connection var ansible_shell_type to sh 30582 1726855394.05191: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.05194: variable 'ansible_connection' from source: unknown 30582 1726855394.05198: variable 'ansible_module_compression' from source: unknown 30582 1726855394.05201: variable 'ansible_shell_type' from source: unknown 30582 1726855394.05203: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.05206: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.05208: variable 'ansible_pipelining' from source: unknown 30582 1726855394.05210: variable 'ansible_timeout' 
from source: unknown 30582 1726855394.05212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.05316: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.05329: variable 'omit' from source: magic vars 30582 1726855394.05335: starting attempt loop 30582 1726855394.05338: running the handler 30582 1726855394.05448: variable 'interface_stat' from source: set_fact 30582 1726855394.05456: Evaluated conditional (not interface_stat.stat.exists): True 30582 1726855394.05461: handler run complete 30582 1726855394.05476: attempt loop complete, returning result 30582 1726855394.05478: _execute() done 30582 1726855394.05481: dumping result to json 30582 1726855394.05484: done dumping result, returning 30582 1726855394.05491: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'statebr' [0affcc66-ac2b-aa83-7d57-0000000028d4] 30582 1726855394.05497: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000028d4 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 30582 1726855394.05641: no more pending results, returning what we have 30582 1726855394.05645: results queue empty 30582 1726855394.05646: checking for any_errors_fatal 30582 1726855394.05656: done checking for any_errors_fatal 30582 1726855394.05657: checking for max_fail_percentage 30582 1726855394.05659: done checking for max_fail_percentage 30582 1726855394.05660: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.05661: done checking to see if all hosts have failed 30582 1726855394.05661: getting the remaining hosts for this loop 30582 1726855394.05663: done getting the remaining hosts for this loop 30582 
1726855394.05667: getting the next task for host managed_node3 30582 1726855394.05677: done getting next task for host managed_node3 30582 1726855394.05680: ^ task is: TASK: Success in test '{{ lsr_description }}' 30582 1726855394.05682: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855394.05689: getting variables 30582 1726855394.05691: in VariableManager get_vars() 30582 1726855394.05740: Calling all_inventory to load vars for managed_node3 30582 1726855394.05743: Calling groups_inventory to load vars for managed_node3 30582 1726855394.05746: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.05757: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.05760: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.05763: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.06613: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000028d4 30582 1726855394.06617: WORKER PROCESS EXITING 30582 1726855394.06628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.07503: done with get_vars() 30582 1726855394.07521: done getting variables 30582 1726855394.07564: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30582 1726855394.07652: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 14:03:14 -0400 (0:00:00.038) 0:02:10.426 ****** 30582 1726855394.07676: entering _queue_task() for managed_node3/debug 30582 1726855394.07935: worker is 1 (out of 1 available) 30582 1726855394.07950: exiting _queue_task() for managed_node3/debug 30582 1726855394.07961: done queuing things up, now waiting for results queue to drain 30582 1726855394.07963: waiting for pending results... 30582 1726855394.08141: running TaskExecutor() for managed_node3/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 30582 1726855394.08219: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b4 30582 1726855394.08231: variable 'ansible_search_path' from source: unknown 30582 1726855394.08235: variable 'ansible_search_path' from source: unknown 30582 1726855394.08264: calling self._execute() 30582 1726855394.08345: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.08349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.08358: variable 'omit' from source: magic vars 30582 1726855394.08642: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.08652: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.08658: variable 'omit' from source: magic vars 30582 1726855394.08691: variable 'omit' from source: magic vars 30582 1726855394.08763: variable 'lsr_description' from source: include params 30582 1726855394.08779: variable 
'omit' from source: magic vars 30582 1726855394.08812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855394.08840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855394.08858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855394.08875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.08885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.08911: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855394.08914: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.08917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.08991: Set connection var ansible_timeout to 10 30582 1726855394.08995: Set connection var ansible_connection to ssh 30582 1726855394.09001: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855394.09005: Set connection var ansible_pipelining to False 30582 1726855394.09010: Set connection var ansible_shell_executable to /bin/sh 30582 1726855394.09012: Set connection var ansible_shell_type to sh 30582 1726855394.09029: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.09032: variable 'ansible_connection' from source: unknown 30582 1726855394.09034: variable 'ansible_module_compression' from source: unknown 30582 1726855394.09036: variable 'ansible_shell_type' from source: unknown 30582 1726855394.09038: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.09040: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.09044: variable 'ansible_pipelining' from source: 
unknown 30582 1726855394.09049: variable 'ansible_timeout' from source: unknown 30582 1726855394.09052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.09151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.09163: variable 'omit' from source: magic vars 30582 1726855394.09171: starting attempt loop 30582 1726855394.09174: running the handler 30582 1726855394.09213: handler run complete 30582 1726855394.09224: attempt loop complete, returning result 30582 1726855394.09226: _execute() done 30582 1726855394.09229: dumping result to json 30582 1726855394.09231: done dumping result, returning 30582 1726855394.09240: done running TaskExecutor() for managed_node3/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [0affcc66-ac2b-aa83-7d57-0000000020b4] 30582 1726855394.09244: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b4 30582 1726855394.09325: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b4 30582 1726855394.09328: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 30582 1726855394.09374: no more pending results, returning what we have 30582 1726855394.09378: results queue empty 30582 1726855394.09379: checking for any_errors_fatal 30582 1726855394.09389: done checking for any_errors_fatal 30582 1726855394.09389: checking for max_fail_percentage 30582 1726855394.09391: done checking for max_fail_percentage 30582 1726855394.09392: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.09393: done checking to see if all hosts have failed 30582 
1726855394.09394: getting the remaining hosts for this loop 30582 1726855394.09395: done getting the remaining hosts for this loop 30582 1726855394.09399: getting the next task for host managed_node3 30582 1726855394.09408: done getting next task for host managed_node3 30582 1726855394.09410: ^ task is: TASK: Cleanup 30582 1726855394.09413: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855394.09419: getting variables 30582 1726855394.09420: in VariableManager get_vars() 30582 1726855394.09467: Calling all_inventory to load vars for managed_node3 30582 1726855394.09471: Calling groups_inventory to load vars for managed_node3 30582 1726855394.09473: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.09484: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.09486: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.09494: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.10474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.11324: done with get_vars() 30582 1726855394.11343: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 14:03:14 -0400 (0:00:00.037) 
0:02:10.464 ****** 30582 1726855394.11415: entering _queue_task() for managed_node3/include_tasks 30582 1726855394.11676: worker is 1 (out of 1 available) 30582 1726855394.11691: exiting _queue_task() for managed_node3/include_tasks 30582 1726855394.11702: done queuing things up, now waiting for results queue to drain 30582 1726855394.11704: waiting for pending results... 30582 1726855394.11892: running TaskExecutor() for managed_node3/TASK: Cleanup 30582 1726855394.11966: in run() - task 0affcc66-ac2b-aa83-7d57-0000000020b8 30582 1726855394.11982: variable 'ansible_search_path' from source: unknown 30582 1726855394.11986: variable 'ansible_search_path' from source: unknown 30582 1726855394.12025: variable 'lsr_cleanup' from source: include params 30582 1726855394.12184: variable 'lsr_cleanup' from source: include params 30582 1726855394.12244: variable 'omit' from source: magic vars 30582 1726855394.12348: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.12355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.12363: variable 'omit' from source: magic vars 30582 1726855394.12540: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.12548: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.12554: variable 'item' from source: unknown 30582 1726855394.12607: variable 'item' from source: unknown 30582 1726855394.12629: variable 'item' from source: unknown 30582 1726855394.12676: variable 'item' from source: unknown 30582 1726855394.12808: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.12811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.12814: variable 'omit' from source: magic vars 30582 1726855394.12890: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.12893: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30582 1726855394.12899: variable 'item' from source: unknown 30582 1726855394.12944: variable 'item' from source: unknown 30582 1726855394.12966: variable 'item' from source: unknown 30582 1726855394.13008: variable 'item' from source: unknown 30582 1726855394.13073: dumping result to json 30582 1726855394.13075: done dumping result, returning 30582 1726855394.13078: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcc66-ac2b-aa83-7d57-0000000020b8] 30582 1726855394.13080: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b8 30582 1726855394.13117: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000020b8 30582 1726855394.13120: WORKER PROCESS EXITING 30582 1726855394.13144: no more pending results, returning what we have 30582 1726855394.13149: in VariableManager get_vars() 30582 1726855394.13203: Calling all_inventory to load vars for managed_node3 30582 1726855394.13206: Calling groups_inventory to load vars for managed_node3 30582 1726855394.13209: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.13222: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.13225: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.13227: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.14058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.14944: done with get_vars() 30582 1726855394.14960: variable 'ansible_search_path' from source: unknown 30582 1726855394.14961: variable 'ansible_search_path' from source: unknown 30582 1726855394.14994: variable 'ansible_search_path' from source: unknown 30582 1726855394.14995: variable 'ansible_search_path' from source: unknown 30582 1726855394.15012: we have included files to process 30582 1726855394.15013: generating all_blocks data 30582 1726855394.15014: done generating 
all_blocks data 30582 1726855394.15017: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855394.15018: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855394.15020: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30582 1726855394.15156: done processing included file 30582 1726855394.15158: iterating over new_blocks loaded from include file 30582 1726855394.15159: in VariableManager get_vars() 30582 1726855394.15173: done with get_vars() 30582 1726855394.15175: filtering new block on tags 30582 1726855394.15195: done filtering new block on tags 30582 1726855394.15197: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node3 => (item=tasks/cleanup_profile+device.yml) 30582 1726855394.15200: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30582 1726855394.15201: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30582 1726855394.15203: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30582 1726855394.15433: done processing included file 30582 1726855394.15434: iterating over new_blocks loaded from include file 30582 1726855394.15435: in VariableManager get_vars() 30582 1726855394.15446: done with get_vars() 30582 1726855394.15447: filtering new block on tags 30582 1726855394.15470: done filtering new block on tags 30582 1726855394.15472: 
done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 => (item=tasks/check_network_dns.yml) 30582 1726855394.15474: extending task lists for all hosts with included blocks 30582 1726855394.16391: done extending task lists 30582 1726855394.16392: done processing included files 30582 1726855394.16393: results queue empty 30582 1726855394.16393: checking for any_errors_fatal 30582 1726855394.16396: done checking for any_errors_fatal 30582 1726855394.16397: checking for max_fail_percentage 30582 1726855394.16397: done checking for max_fail_percentage 30582 1726855394.16398: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.16399: done checking to see if all hosts have failed 30582 1726855394.16399: getting the remaining hosts for this loop 30582 1726855394.16400: done getting the remaining hosts for this loop 30582 1726855394.16401: getting the next task for host managed_node3 30582 1726855394.16404: done getting next task for host managed_node3 30582 1726855394.16406: ^ task is: TASK: Cleanup profile and device 30582 1726855394.16408: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855394.16409: getting variables 30582 1726855394.16410: in VariableManager get_vars() 30582 1726855394.16418: Calling all_inventory to load vars for managed_node3 30582 1726855394.16424: Calling groups_inventory to load vars for managed_node3 30582 1726855394.16426: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.16431: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.16433: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.16435: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.17120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.17973: done with get_vars() 30582 1726855394.17991: done getting variables 30582 1726855394.18020: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 14:03:14 -0400 (0:00:00.066) 0:02:10.530 ****** 30582 1726855394.18041: entering _queue_task() for managed_node3/shell 30582 1726855394.18320: worker is 1 (out of 1 available) 30582 1726855394.18334: exiting _queue_task() for managed_node3/shell 30582 1726855394.18345: done queuing things up, now waiting for results queue to drain 30582 1726855394.18347: waiting for pending results... 
30582 1726855394.18530: running TaskExecutor() for managed_node3/TASK: Cleanup profile and device 30582 1726855394.18606: in run() - task 0affcc66-ac2b-aa83-7d57-00000000299e 30582 1726855394.18617: variable 'ansible_search_path' from source: unknown 30582 1726855394.18622: variable 'ansible_search_path' from source: unknown 30582 1726855394.18653: calling self._execute() 30582 1726855394.18728: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.18732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.18740: variable 'omit' from source: magic vars 30582 1726855394.19022: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.19031: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.19037: variable 'omit' from source: magic vars 30582 1726855394.19067: variable 'omit' from source: magic vars 30582 1726855394.19172: variable 'interface' from source: play vars 30582 1726855394.19186: variable 'omit' from source: magic vars 30582 1726855394.19220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855394.19248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855394.19267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855394.19279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.19291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.19315: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855394.19318: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.19320: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.19396: Set connection var ansible_timeout to 10 30582 1726855394.19399: Set connection var ansible_connection to ssh 30582 1726855394.19405: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855394.19409: Set connection var ansible_pipelining to False 30582 1726855394.19414: Set connection var ansible_shell_executable to /bin/sh 30582 1726855394.19417: Set connection var ansible_shell_type to sh 30582 1726855394.19434: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.19437: variable 'ansible_connection' from source: unknown 30582 1726855394.19441: variable 'ansible_module_compression' from source: unknown 30582 1726855394.19444: variable 'ansible_shell_type' from source: unknown 30582 1726855394.19446: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.19448: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.19451: variable 'ansible_pipelining' from source: unknown 30582 1726855394.19454: variable 'ansible_timeout' from source: unknown 30582 1726855394.19456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.19554: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.19568: variable 'omit' from source: magic vars 30582 1726855394.19571: starting attempt loop 30582 1726855394.19574: running the handler 30582 1726855394.19582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.19600: _low_level_execute_command(): starting 30582 1726855394.19606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855394.20118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.20122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.20125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.20127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.20173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.20177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.20189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.20266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.21959: stdout chunk (state=3): >>>/root <<< 30582 1726855394.22059: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.22090: stderr chunk (state=3): >>><<< 30582 1726855394.22093: stdout chunk (state=3): >>><<< 30582 1726855394.22114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.22126: _low_level_execute_command(): starting 30582 1726855394.22131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487 `" && echo ansible-tmp-1726855394.2211406-36119-70291773555487="` echo /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487 `" ) && sleep 0' 30582 1726855394.22565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30582 1726855394.22576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855394.22579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855394.22581: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.22583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.22629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.22636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.22693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.24624: stdout chunk (state=3): >>>ansible-tmp-1726855394.2211406-36119-70291773555487=/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487 <<< 30582 1726855394.24726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.24754: stderr chunk (state=3): >>><<< 30582 1726855394.24757: stdout chunk (state=3): >>><<< 30582 1726855394.24776: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726855394.2211406-36119-70291773555487=/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.24805: variable 'ansible_module_compression' from source: unknown 30582 1726855394.24849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855394.24882: variable 'ansible_facts' from source: unknown 30582 1726855394.24937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py 30582 1726855394.25040: Sending initial data 30582 1726855394.25043: Sent initial data (155 bytes) 30582 1726855394.25491: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.25494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.25496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.25498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.25548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.25551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.25616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.27217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 30582 1726855394.27222: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855394.27274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855394.27336: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp09pbp9os /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py <<< 30582 1726855394.27339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py" <<< 30582 1726855394.27393: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp09pbp9os" to remote "/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py" <<< 30582 1726855394.27398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py" <<< 30582 1726855394.27983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.28026: stderr chunk (state=3): >>><<< 30582 1726855394.28030: stdout chunk (state=3): >>><<< 30582 1726855394.28059: done transferring module to remote 30582 1726855394.28069: _low_level_execute_command(): starting 30582 1726855394.28073: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/ /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py && sleep 0' 30582 1726855394.28515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.28518: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855394.28524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.28526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.28528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.28579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.28583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.28585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.28647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.30438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.30467: stderr chunk (state=3): >>><<< 30582 1726855394.30470: stdout chunk (state=3): >>><<< 30582 1726855394.30486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.30491: _low_level_execute_command(): starting 30582 1726855394.30495: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/AnsiballZ_command.py && sleep 0' 30582 1726855394.30935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.30938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.30941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.30943: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855394.30945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.30993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.30997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.31006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.31068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.50009: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:03:14.465197", "end": "2024-09-20 14:03:14.499195", "delta": "0:00:00.033998", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855394.51579: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855394.51607: stderr chunk (state=3): >>><<< 30582 1726855394.51610: stdout chunk (state=3): >>><<< 30582 1726855394.51627: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 14:03:14.465197", "end": "2024-09-20 14:03:14.499195", "delta": "0:00:00.033998", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 30582 1726855394.51660: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855394.51672: _low_level_execute_command(): starting 30582 1726855394.51674: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855394.2211406-36119-70291773555487/ > /dev/null 2>&1 && sleep 0' 30582 1726855394.52134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.52137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855394.52139: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30582 1726855394.52141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855394.52143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.52202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.52209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.52211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.52272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.54156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.54184: stderr chunk (state=3): >>><<< 30582 1726855394.54188: stdout chunk (state=3): >>><<< 30582 1726855394.54205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.54211: handler run complete 30582 1726855394.54231: Evaluated conditional (False): False 30582 1726855394.54240: attempt loop complete, returning result 30582 1726855394.54243: _execute() done 30582 1726855394.54245: dumping result to json 30582 1726855394.54251: done dumping result, returning 30582 1726855394.54259: done running TaskExecutor() for managed_node3/TASK: Cleanup profile and device [0affcc66-ac2b-aa83-7d57-00000000299e] 30582 1726855394.54266: sending task result for task 0affcc66-ac2b-aa83-7d57-00000000299e 30582 1726855394.54366: done sending task result for task 0affcc66-ac2b-aa83-7d57-00000000299e 30582 1726855394.54370: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.033998", "end": "2024-09-20 14:03:14.499195", "rc": 1, "start": "2024-09-20 14:03:14.465197" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 30582 1726855394.54435: no more pending results, returning what we have 30582 1726855394.54440: results queue empty 30582 1726855394.54441: checking for any_errors_fatal 30582 1726855394.54442: done checking for any_errors_fatal 30582 1726855394.54443: checking for max_fail_percentage 30582 1726855394.54445: done checking for max_fail_percentage 30582 1726855394.54446: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.54447: done checking to see if all hosts have failed 30582 1726855394.54447: getting the remaining hosts for this loop 30582 1726855394.54449: done getting the remaining hosts for this loop 30582 1726855394.54453: getting the next task for host managed_node3 30582 1726855394.54465: done getting next task for host managed_node3 30582 1726855394.54468: ^ task is: TASK: Check routes and DNS 30582 1726855394.54472: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
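The `Cleanup profile and device` task above exits rc=1 (and relies on `...ignoring`) because every command in it assumes the `statebr` profile, ifcfg file, and device still exist. A minimal sketch of an idempotent variant, assuming nothing beyond `nmcli`/`ip` from the original commands (the `cleanup_conn` function name is hypothetical, not part of the playbook):

```shell
#!/bin/sh
# Hedged sketch, not from the original run: guard each cleanup step so a
# missing profile/file/device yields rc=0 instead of a failure to ignore.
cleanup_conn() {
    conn="$1"
    ifcfg="/etc/sysconfig/network-scripts/ifcfg-$conn"
    if command -v nmcli >/dev/null 2>&1; then
        # delete the NetworkManager profile only if it actually exists
        nmcli -t -f NAME con show | grep -qx "$conn" && nmcli con delete "$conn"
        # re-load the legacy ifcfg file only if it is present
        [ -f "$ifcfg" ] && nmcli con load "$ifcfg"
    fi
    rm -f "$ifcfg"
    # remove the kernel device only if it exists
    if command -v ip >/dev/null 2>&1 && ip link show "$conn" >/dev/null 2>&1; then
        ip link del "$conn"
    fi
    return 0
}

cleanup_conn statebr
```

With guards like these the task no longer needs `ignore_errors`-style handling; on a host where `statebr` was never created, every step is a no-op.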
False 30582 1726855394.54477: getting variables 30582 1726855394.54479: in VariableManager get_vars() 30582 1726855394.54527: Calling all_inventory to load vars for managed_node3 30582 1726855394.54530: Calling groups_inventory to load vars for managed_node3 30582 1726855394.54533: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.54544: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.54546: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.54548: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.55402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.56423: done with get_vars() 30582 1726855394.56442: done getting variables 30582 1726855394.56491: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 14:03:14 -0400 (0:00:00.384) 0:02:10.915 ****** 30582 1726855394.56514: entering _queue_task() for managed_node3/shell 30582 1726855394.56783: worker is 1 (out of 1 available) 30582 1726855394.56799: exiting _queue_task() for managed_node3/shell 30582 1726855394.56811: done queuing things up, now waiting for results queue to drain 30582 1726855394.56813: waiting for pending results... 
30582 1726855394.56997: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 30582 1726855394.57072: in run() - task 0affcc66-ac2b-aa83-7d57-0000000029a2 30582 1726855394.57085: variable 'ansible_search_path' from source: unknown 30582 1726855394.57090: variable 'ansible_search_path' from source: unknown 30582 1726855394.57116: calling self._execute() 30582 1726855394.57195: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.57199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.57207: variable 'omit' from source: magic vars 30582 1726855394.57490: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.57500: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.57506: variable 'omit' from source: magic vars 30582 1726855394.57537: variable 'omit' from source: magic vars 30582 1726855394.57561: variable 'omit' from source: magic vars 30582 1726855394.57597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855394.57626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855394.57643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855394.57657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.57666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.57700: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855394.57703: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.57706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.57777: 
Set connection var ansible_timeout to 10 30582 1726855394.57780: Set connection var ansible_connection to ssh 30582 1726855394.57785: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855394.57791: Set connection var ansible_pipelining to False 30582 1726855394.57798: Set connection var ansible_shell_executable to /bin/sh 30582 1726855394.57801: Set connection var ansible_shell_type to sh 30582 1726855394.57819: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.57822: variable 'ansible_connection' from source: unknown 30582 1726855394.57825: variable 'ansible_module_compression' from source: unknown 30582 1726855394.57827: variable 'ansible_shell_type' from source: unknown 30582 1726855394.57830: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.57832: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.57835: variable 'ansible_pipelining' from source: unknown 30582 1726855394.57837: variable 'ansible_timeout' from source: unknown 30582 1726855394.57842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.57948: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.57958: variable 'omit' from source: magic vars 30582 1726855394.57966: starting attempt loop 30582 1726855394.57970: running the handler 30582 1726855394.57976: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.57996: 
_low_level_execute_command(): starting 30582 1726855394.58002: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855394.58525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.58528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.58532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855394.58534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.58580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.58583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.58606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.58667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.60369: stdout chunk (state=3): >>>/root <<< 30582 1726855394.60471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.60501: stderr chunk (state=3): >>><<< 30582 1726855394.60504: stdout chunk (state=3): >>><<< 30582 1726855394.60525: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.60539: _low_level_execute_command(): starting 30582 1726855394.60543: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644 `" && echo ansible-tmp-1726855394.6052675-36127-33829090596644="` echo /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644 `" ) && sleep 0' 30582 1726855394.60984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.60997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.61000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.61003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855394.61006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.61046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.61053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.61055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.61114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.63049: stdout chunk (state=3): >>>ansible-tmp-1726855394.6052675-36127-33829090596644=/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644 <<< 30582 1726855394.63153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.63180: stderr chunk (state=3): >>><<< 30582 1726855394.63183: stdout chunk (state=3): >>><<< 30582 1726855394.63201: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855394.6052675-36127-33829090596644=/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.63230: variable 'ansible_module_compression' from source: unknown 30582 1726855394.63277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855394.63310: variable 'ansible_facts' from source: unknown 30582 1726855394.63366: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py 30582 1726855394.63476: Sending initial data 30582 1726855394.63479: Sent initial data (155 bytes) 30582 1726855394.63930: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.63934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found <<< 30582 1726855394.63936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.63939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855394.63941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.63995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.63998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.64012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.64065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.65666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855394.65718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855394.65782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxuh8bj3q /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py <<< 30582 1726855394.65789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py" <<< 30582 1726855394.65844: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmpxuh8bj3q" to remote "/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py" <<< 30582 1726855394.65848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py" <<< 30582 1726855394.66443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.66486: stderr chunk (state=3): >>><<< 30582 1726855394.66491: stdout chunk (state=3): >>><<< 30582 1726855394.66508: done transferring module to remote 30582 1726855394.66517: _low_level_execute_command(): starting 30582 1726855394.66523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/ /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py && sleep 0' 30582 1726855394.66970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.66973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855394.66979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.66981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.66983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855394.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.67032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.67037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.67039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.67095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.68903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.68930: stderr chunk (state=3): >>><<< 30582 1726855394.68933: stdout chunk (state=3): >>><<< 30582 1726855394.68946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.68948: _low_level_execute_command(): starting 30582 1726855394.68954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/AnsiballZ_command.py && sleep 0' 30582 1726855394.69381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.69384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.69386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 30582 1726855394.69390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.69393: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.69439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.69442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.69515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.86024: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2457sec preferred_lft 2457sec\n inet6 fe80::1088:11ff:feda:7fa3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n35: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether aa:60:c4:d8:31:87 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.244 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 
10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 14:03:14.850205", "end": "2024-09-20 14:03:14.859306", "delta": "0:00:00.009101", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855394.87710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855394.87736: stderr chunk (state=3): >>><<< 30582 1726855394.87739: stdout chunk (state=3): >>><<< 30582 1726855394.87756: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2457sec preferred_lft 2457sec\n inet6 fe80::1088:11ff:feda:7fa3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n35: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether aa:60:c4:d8:31:87 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.244 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 14:03:14.850205", "end": "2024-09-20 14:03:14.859306", "delta": "0:00:00.009101", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip 
a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
30582 1726855394.87803: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855394.87809: _low_level_execute_command(): starting 30582 1726855394.87814: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855394.6052675-36127-33829090596644/ > /dev/null 2>&1 && sleep 0' 30582 1726855394.88256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.88260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.88264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855394.88267: stderr chunk 
(state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.88269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.88320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.88325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.88331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.88390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.90265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.90292: stderr chunk (state=3): >>><<< 30582 1726855394.90295: stdout chunk (state=3): >>><<< 30582 1726855394.90308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.90315: handler run complete 30582 1726855394.90332: Evaluated conditional (False): False 30582 1726855394.90341: attempt loop complete, returning result 30582 1726855394.90344: _execute() done 30582 1726855394.90346: dumping result to json 30582 1726855394.90352: done dumping result, returning 30582 1726855394.90360: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affcc66-ac2b-aa83-7d57-0000000029a2] 30582 1726855394.90367: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000029a2 30582 1726855394.90475: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000029a2 30582 1726855394.90479: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009101", "end": "2024-09-20 14:03:14.859306", "rc": 0, "start": "2024-09-20 14:03:14.850205" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2457sec preferred_lft 2457sec inet6 fe80::1088:11ff:feda:7fa3/64 
scope link noprefixroute valid_lft forever preferred_lft forever 35: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether aa:60:c4:d8:31:87 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.244 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 30582 1726855394.90547: no more pending results, returning what we have 30582 1726855394.90552: results queue empty 30582 1726855394.90553: checking for any_errors_fatal 30582 1726855394.90565: done checking for any_errors_fatal 30582 1726855394.90566: checking for max_fail_percentage 30582 1726855394.90568: done checking for max_fail_percentage 30582 1726855394.90569: checking to see if all hosts have failed and the running result is not ok 30582 1726855394.90570: done checking to see if all hosts have failed 30582 1726855394.90570: getting the remaining hosts for this loop 30582 1726855394.90572: done getting the remaining hosts for this loop 30582 1726855394.90576: getting the next task for host managed_node3 30582 1726855394.90584: done getting next task for host managed_node3 30582 1726855394.90586: ^ task is: TASK: Verify DNS and network connectivity 30582 1726855394.90592: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855394.90601: getting variables 30582 1726855394.90603: in VariableManager get_vars() 30582 1726855394.90647: Calling all_inventory to load vars for managed_node3 30582 1726855394.90649: Calling groups_inventory to load vars for managed_node3 30582 1726855394.90654: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855394.90666: Calling all_plugins_play to load vars for managed_node3 30582 1726855394.90669: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855394.90672: Calling groups_plugins_play to load vars for managed_node3 30582 1726855394.91515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855394.92391: done with get_vars() 30582 1726855394.92409: done getting variables 30582 1726855394.92452: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 14:03:14 -0400 (0:00:00.359) 0:02:11.274 ****** 30582 1726855394.92477: entering _queue_task() for managed_node3/shell 30582 
1726855394.92725: worker is 1 (out of 1 available) 30582 1726855394.92738: exiting _queue_task() for managed_node3/shell 30582 1726855394.92751: done queuing things up, now waiting for results queue to drain 30582 1726855394.92753: waiting for pending results... 30582 1726855394.92937: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 30582 1726855394.93011: in run() - task 0affcc66-ac2b-aa83-7d57-0000000029a3 30582 1726855394.93021: variable 'ansible_search_path' from source: unknown 30582 1726855394.93025: variable 'ansible_search_path' from source: unknown 30582 1726855394.93054: calling self._execute() 30582 1726855394.93132: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.93135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.93144: variable 'omit' from source: magic vars 30582 1726855394.93426: variable 'ansible_distribution_major_version' from source: facts 30582 1726855394.93436: Evaluated conditional (ansible_distribution_major_version != '6'): True 30582 1726855394.93536: variable 'ansible_facts' from source: unknown 30582 1726855394.94013: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 30582 1726855394.94018: variable 'omit' from source: magic vars 30582 1726855394.94052: variable 'omit' from source: magic vars 30582 1726855394.94078: variable 'omit' from source: magic vars 30582 1726855394.94112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30582 1726855394.94140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30582 1726855394.94155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30582 1726855394.94169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.94181: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30582 1726855394.94206: variable 'inventory_hostname' from source: host vars for 'managed_node3' 30582 1726855394.94210: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.94212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.94284: Set connection var ansible_timeout to 10 30582 1726855394.94289: Set connection var ansible_connection to ssh 30582 1726855394.94292: Set connection var ansible_module_compression to ZIP_DEFLATED 30582 1726855394.94300: Set connection var ansible_pipelining to False 30582 1726855394.94302: Set connection var ansible_shell_executable to /bin/sh 30582 1726855394.94310: Set connection var ansible_shell_type to sh 30582 1726855394.94326: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.94329: variable 'ansible_connection' from source: unknown 30582 1726855394.94332: variable 'ansible_module_compression' from source: unknown 30582 1726855394.94334: variable 'ansible_shell_type' from source: unknown 30582 1726855394.94336: variable 'ansible_shell_executable' from source: unknown 30582 1726855394.94338: variable 'ansible_host' from source: host vars for 'managed_node3' 30582 1726855394.94341: variable 'ansible_pipelining' from source: unknown 30582 1726855394.94344: variable 'ansible_timeout' from source: unknown 30582 1726855394.94348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 30582 1726855394.94452: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.94461: variable 'omit' from source: magic vars 30582 
1726855394.94467: starting attempt loop 30582 1726855394.94469: running the handler 30582 1726855394.94479: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30582 1726855394.94496: _low_level_execute_command(): starting 30582 1726855394.94504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30582 1726855394.95017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855394.95021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.95024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 30582 1726855394.95027: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.95069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.95073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.95082: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30582 1726855394.95154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.96855: stdout chunk (state=3): >>>/root <<< 30582 1726855394.96951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855394.96981: stderr chunk (state=3): >>><<< 30582 1726855394.96984: stdout chunk (state=3): >>><<< 30582 1726855394.97009: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.97020: _low_level_execute_command(): starting 30582 1726855394.97026: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051 `" && echo 
ansible-tmp-1726855394.9700766-36135-32424980060051="` echo /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051 `" ) && sleep 0' 30582 1726855394.97471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855394.97474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855394.97477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 30582 1726855394.97479: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855394.97481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855394.97536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855394.97542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855394.97544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855394.97608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855394.99572: stdout chunk (state=3): >>>ansible-tmp-1726855394.9700766-36135-32424980060051=/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051 <<< 30582 1726855394.99793: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30582 1726855394.99797: stdout chunk (state=3): >>><<< 30582 1726855394.99800: stderr chunk (state=3): >>><<< 30582 1726855394.99802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855394.9700766-36135-32424980060051=/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855394.99805: variable 'ansible_module_compression' from source: unknown 30582 1726855394.99854: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30582qfa9_3j6/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30582 1726855394.99899: variable 'ansible_facts' from source: unknown 30582 1726855394.99997: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py 30582 
1726855395.00093: Sending initial data 30582 1726855395.00106: Sent initial data (155 bytes) 30582 1726855395.00531: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855395.00534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 30582 1726855395.00536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.00539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855395.00541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.00593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855395.00597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855395.00659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855395.02259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 30582 1726855395.02266: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 30582 1726855395.02317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 30582 1726855395.02378: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3cfhdx9y /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py <<< 30582 1726855395.02384: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py" <<< 30582 1726855395.02439: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-30582qfa9_3j6/tmp3cfhdx9y" to remote "/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py" <<< 30582 1726855395.02445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py" <<< 30582 1726855395.03041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855395.03089: stderr chunk (state=3): >>><<< 30582 1726855395.03093: stdout chunk (state=3): >>><<< 30582 1726855395.03128: done transferring module to remote 30582 1726855395.03136: _low_level_execute_command(): starting 30582 1726855395.03141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/ 
/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py && sleep 0' 30582 1726855395.03595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855395.03604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855395.03606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.03609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30582 1726855395.03611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 30582 1726855395.03613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.03659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855395.03663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855395.03667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855395.03725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855395.05532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855395.05562: stderr chunk (state=3): >>><<< 30582 1726855395.05568: stdout chunk (state=3): >>><<< 30582 
1726855395.05579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855395.05582: _low_level_execute_command(): starting 30582 1726855395.05589: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/AnsiballZ_command.py && sleep 0' 30582 1726855395.06025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855395.06028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.06030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855395.06033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.06073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855395.06085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855395.06159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855395.58737: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1384 0 --:--:-- --:--:-- --:--:-- 1386\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2339 0 --:--:-- --:--:-- --:--:-- 2346", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 14:03:15.216041", "end": "2024-09-20 14:03:15.586323", "delta": "0:00:00.370282", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30582 1726855395.60334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 30582 1726855395.60365: stderr chunk (state=3): >>><<< 30582 1726855395.60369: stdout chunk (state=3): >>><<< 30582 1726855395.60394: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1384 0 --:--:-- --:--:-- --:--:-- 1386\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2339 0 --:--:-- --:--:-- --:--:-- 2346", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 14:03:15.216041", "end": "2024-09-20 14:03:15.586323", "delta": "0:00:00.370282", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 30582 1726855395.60430: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30582 1726855395.60437: _low_level_execute_command(): starting 30582 1726855395.60442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855394.9700766-36135-32424980060051/ > /dev/null 2>&1 && sleep 0' 30582 1726855395.60902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30582 1726855395.60906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30582 1726855395.60908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.60910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30582 1726855395.60912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30582 1726855395.60964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 30582 1726855395.60968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30582 1726855395.60970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30582 1726855395.61036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30582 1726855395.62895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30582 1726855395.62920: stderr chunk (state=3): >>><<< 30582 1726855395.62923: stdout chunk (state=3): >>><<< 30582 1726855395.62935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30582 1726855395.62942: handler run complete 30582 1726855395.62963: Evaluated conditional (False): False 30582 1726855395.62973: attempt loop complete, returning result 30582 1726855395.62976: _execute() done 30582 1726855395.62978: dumping result to json 30582 1726855395.62984: done dumping result, returning 30582 1726855395.62994: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affcc66-ac2b-aa83-7d57-0000000029a3] 30582 1726855395.62999: sending task result for task 0affcc66-ac2b-aa83-7d57-0000000029a3 30582 1726855395.63098: done sending task result for task 0affcc66-ac2b-aa83-7d57-0000000029a3 30582 1726855395.63100: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.370282", "end": "2024-09-20 14:03:15.586323", "rc": 0, "start": "2024-09-20 14:03:15.216041" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1384 0 --:--:-- --:--:-- --:--:-- 1386 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2339 0 --:--:-- --:--:-- --:--:-- 2346 30582 1726855395.63165: no more pending results, returning what we have 30582 1726855395.63170: results queue empty 30582 1726855395.63171: 
checking for any_errors_fatal 30582 1726855395.63183: done checking for any_errors_fatal 30582 1726855395.63183: checking for max_fail_percentage 30582 1726855395.63185: done checking for max_fail_percentage 30582 1726855395.63186: checking to see if all hosts have failed and the running result is not ok 30582 1726855395.63189: done checking to see if all hosts have failed 30582 1726855395.63190: getting the remaining hosts for this loop 30582 1726855395.63191: done getting the remaining hosts for this loop 30582 1726855395.63199: getting the next task for host managed_node3 30582 1726855395.63213: done getting next task for host managed_node3 30582 1726855395.63215: ^ task is: TASK: meta (flush_handlers) 30582 1726855395.63217: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855395.63221: getting variables 30582 1726855395.63223: in VariableManager get_vars() 30582 1726855395.63266: Calling all_inventory to load vars for managed_node3 30582 1726855395.63269: Calling groups_inventory to load vars for managed_node3 30582 1726855395.63272: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855395.63283: Calling all_plugins_play to load vars for managed_node3 30582 1726855395.63286: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855395.63294: Calling groups_plugins_play to load vars for managed_node3 30582 1726855395.64305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855395.65146: done with get_vars() 30582 1726855395.65165: done getting variables 30582 1726855395.65216: in VariableManager get_vars() 30582 1726855395.65226: Calling all_inventory to load vars for managed_node3 30582 1726855395.65227: Calling groups_inventory to load vars for managed_node3 30582 1726855395.65229: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855395.65232: Calling all_plugins_play to load vars for managed_node3 30582 1726855395.65235: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855395.65237: Calling groups_plugins_play to load vars for managed_node3 30582 1726855395.65868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855395.66727: done with get_vars() 30582 1726855395.66749: done queuing things up, now waiting for results queue to drain 30582 1726855395.66751: results queue empty 30582 1726855395.66752: checking for any_errors_fatal 30582 1726855395.66755: done checking for any_errors_fatal 30582 1726855395.66755: checking for max_fail_percentage 30582 1726855395.66756: done checking for max_fail_percentage 30582 1726855395.66756: checking to see if all hosts have failed and the running result is not 
ok 30582 1726855395.66757: done checking to see if all hosts have failed 30582 1726855395.66757: getting the remaining hosts for this loop 30582 1726855395.66758: done getting the remaining hosts for this loop 30582 1726855395.66760: getting the next task for host managed_node3 30582 1726855395.66763: done getting next task for host managed_node3 30582 1726855395.66764: ^ task is: TASK: meta (flush_handlers) 30582 1726855395.66765: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30582 1726855395.66767: getting variables 30582 1726855395.66768: in VariableManager get_vars() 30582 1726855395.66777: Calling all_inventory to load vars for managed_node3 30582 1726855395.66778: Calling groups_inventory to load vars for managed_node3 30582 1726855395.66779: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855395.66785: Calling all_plugins_play to load vars for managed_node3 30582 1726855395.66786: Calling groups_plugins_inventory to load vars for managed_node3 30582 1726855395.66790: Calling groups_plugins_play to load vars for managed_node3 30582 1726855395.67471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855395.68300: done with get_vars() 30582 1726855395.68314: done getting variables 30582 1726855395.68348: in VariableManager get_vars() 30582 1726855395.68356: Calling all_inventory to load vars for managed_node3 30582 1726855395.68358: Calling groups_inventory to load vars for managed_node3 30582 1726855395.68359: Calling all_plugins_inventory to load vars for managed_node3 30582 1726855395.68362: Calling all_plugins_play to load vars for managed_node3 30582 1726855395.68364: Calling groups_plugins_inventory to load vars for 
managed_node3 30582 1726855395.68367: Calling groups_plugins_play to load vars for managed_node3 30582 1726855395.68996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30582 1726855395.69945: done with get_vars() 30582 1726855395.69964: done queuing things up, now waiting for results queue to drain 30582 1726855395.69965: results queue empty 30582 1726855395.69966: checking for any_errors_fatal 30582 1726855395.69967: done checking for any_errors_fatal 30582 1726855395.69967: checking for max_fail_percentage 30582 1726855395.69968: done checking for max_fail_percentage 30582 1726855395.69968: checking to see if all hosts have failed and the running result is not ok 30582 1726855395.69969: done checking to see if all hosts have failed 30582 1726855395.69969: getting the remaining hosts for this loop 30582 1726855395.69970: done getting the remaining hosts for this loop 30582 1726855395.69972: getting the next task for host managed_node3 30582 1726855395.69974: done getting next task for host managed_node3 30582 1726855395.69975: ^ task is: None 30582 1726855395.69976: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30582 1726855395.69977: done queuing things up, now waiting for results queue to drain 30582 1726855395.69977: results queue empty 30582 1726855395.69978: checking for any_errors_fatal 30582 1726855395.69978: done checking for any_errors_fatal 30582 1726855395.69979: checking for max_fail_percentage 30582 1726855395.69979: done checking for max_fail_percentage 30582 1726855395.69980: checking to see if all hosts have failed and the running result is not ok 30582 1726855395.69980: done checking to see if all hosts have failed 30582 1726855395.69982: getting the next task for host managed_node3 30582 1726855395.69984: done getting next task for host managed_node3 30582 1726855395.69984: ^ task is: None 30582 1726855395.69985: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node3 : ok=334 changed=10 unreachable=0 failed=0 skipped=312 rescued=0 ignored=10 Friday 20 September 2024 14:03:15 -0400 (0:00:00.775) 0:02:12.050 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.13s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.08s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.93s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.87s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.86s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.85s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.84s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.83s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.82s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.81s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.80s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.75s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.73s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.73s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6 fedora.linux_system_roles.network : Check which services are running ---- 1.73s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 1.72s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.16s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.11s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Check which packages are installed --- 1.10s /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.05s 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 30582 1726855395.70195: RUNNING CLEANUP